Oct 13 13:07:01 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 13 13:07:01 crc restorecon[4741]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 13:07:01 crc restorecon[4741]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 
13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 
crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 
13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc 
restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:01 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 
crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc 
restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc 
restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 13:07:02 crc restorecon[4741]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc 
restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 13:07:02 crc restorecon[4741]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 13 13:07:02 crc kubenswrapper[4797]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 13:07:02 crc kubenswrapper[4797]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 13 13:07:02 crc kubenswrapper[4797]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 13:07:02 crc kubenswrapper[4797]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 13 13:07:02 crc kubenswrapper[4797]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 13 13:07:02 crc kubenswrapper[4797]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.934955 4797 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944556 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944614 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944624 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944637 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944646 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944657 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944667 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944675 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944683 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944695 4797 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944709 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944720 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944729 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944739 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944749 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944760 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944772 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944783 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944792 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944801 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944846 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944857 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944868 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944878 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944887 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944895 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944905 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944913 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944922 4797 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944930 4797 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 
13:07:02.944938 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944946 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944953 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944962 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944972 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944984 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.944993 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945002 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945012 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945022 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945030 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945038 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945047 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945055 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945063 4797 feature_gate.go:330] unrecognized feature gate: 
UpgradeStatus Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945072 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945079 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945088 4797 feature_gate.go:330] unrecognized feature gate: Example Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945096 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945103 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945112 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945119 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945129 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945137 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945145 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945158 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945171 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945182 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945191 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945199 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945208 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945216 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945227 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945235 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945243 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945254 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945264 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945274 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945283 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945291 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.945299 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945495 4797 flags.go:64] FLAG: --address="0.0.0.0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945523 4797 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945552 4797 flags.go:64] FLAG: --anonymous-auth="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945567 4797 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945582 4797 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945593 4797 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945643 4797 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945656 4797 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945665 4797 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945675 4797 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 
13:07:02.945687 4797 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945700 4797 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945709 4797 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945719 4797 flags.go:64] FLAG: --cgroup-root="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945728 4797 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945737 4797 flags.go:64] FLAG: --client-ca-file="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945748 4797 flags.go:64] FLAG: --cloud-config="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945758 4797 flags.go:64] FLAG: --cloud-provider="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945768 4797 flags.go:64] FLAG: --cluster-dns="[]" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945781 4797 flags.go:64] FLAG: --cluster-domain="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945792 4797 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945803 4797 flags.go:64] FLAG: --config-dir="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945839 4797 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945851 4797 flags.go:64] FLAG: --container-log-max-files="5" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945864 4797 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945874 4797 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945884 4797 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 13 13:07:02 crc kubenswrapper[4797]: 
I1013 13:07:02.945894 4797 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945903 4797 flags.go:64] FLAG: --contention-profiling="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945913 4797 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945923 4797 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945934 4797 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945944 4797 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.945958 4797 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946148 4797 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946158 4797 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946168 4797 flags.go:64] FLAG: --enable-load-reader="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946181 4797 flags.go:64] FLAG: --enable-server="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946191 4797 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946204 4797 flags.go:64] FLAG: --event-burst="100" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946214 4797 flags.go:64] FLAG: --event-qps="50" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946223 4797 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946234 4797 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946243 4797 flags.go:64] FLAG: --eviction-hard="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 
13:07:02.946254 4797 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946264 4797 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946273 4797 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946283 4797 flags.go:64] FLAG: --eviction-soft="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946293 4797 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946302 4797 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946313 4797 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946322 4797 flags.go:64] FLAG: --experimental-mounter-path="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946331 4797 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946340 4797 flags.go:64] FLAG: --fail-swap-on="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946350 4797 flags.go:64] FLAG: --feature-gates="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946362 4797 flags.go:64] FLAG: --file-check-frequency="20s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946372 4797 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946382 4797 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946391 4797 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946401 4797 flags.go:64] FLAG: --healthz-port="10248" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946426 4797 flags.go:64] FLAG: --help="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 
13:07:02.946435 4797 flags.go:64] FLAG: --hostname-override="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946445 4797 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946455 4797 flags.go:64] FLAG: --http-check-frequency="20s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946464 4797 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946473 4797 flags.go:64] FLAG: --image-credential-provider-config="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946482 4797 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946491 4797 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946501 4797 flags.go:64] FLAG: --image-service-endpoint="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946511 4797 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946520 4797 flags.go:64] FLAG: --kube-api-burst="100" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946529 4797 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946539 4797 flags.go:64] FLAG: --kube-api-qps="50" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946549 4797 flags.go:64] FLAG: --kube-reserved="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946558 4797 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946566 4797 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946575 4797 flags.go:64] FLAG: --kubelet-cgroups="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946584 4797 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 13 13:07:02 crc 
kubenswrapper[4797]: I1013 13:07:02.946594 4797 flags.go:64] FLAG: --lock-file="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946603 4797 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946613 4797 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946622 4797 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946641 4797 flags.go:64] FLAG: --log-json-split-stream="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946652 4797 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946662 4797 flags.go:64] FLAG: --log-text-split-stream="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946670 4797 flags.go:64] FLAG: --logging-format="text" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946679 4797 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946689 4797 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946698 4797 flags.go:64] FLAG: --manifest-url="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946709 4797 flags.go:64] FLAG: --manifest-url-header="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946723 4797 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946733 4797 flags.go:64] FLAG: --max-open-files="1000000" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946745 4797 flags.go:64] FLAG: --max-pods="110" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946754 4797 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946764 4797 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 13 13:07:02 crc 
kubenswrapper[4797]: I1013 13:07:02.946773 4797 flags.go:64] FLAG: --memory-manager-policy="None" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946782 4797 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946792 4797 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946809 4797 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946843 4797 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946871 4797 flags.go:64] FLAG: --node-status-max-images="50" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946880 4797 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946890 4797 flags.go:64] FLAG: --oom-score-adj="-999" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946899 4797 flags.go:64] FLAG: --pod-cidr="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946909 4797 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946923 4797 flags.go:64] FLAG: --pod-manifest-path="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946932 4797 flags.go:64] FLAG: --pod-max-pids="-1" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946942 4797 flags.go:64] FLAG: --pods-per-core="0" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946951 4797 flags.go:64] FLAG: --port="10250" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946960 4797 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946969 4797 flags.go:64] FLAG: 
--provider-id="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946978 4797 flags.go:64] FLAG: --qos-reserved="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946987 4797 flags.go:64] FLAG: --read-only-port="10255" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.946997 4797 flags.go:64] FLAG: --register-node="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947006 4797 flags.go:64] FLAG: --register-schedulable="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947016 4797 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947033 4797 flags.go:64] FLAG: --registry-burst="10" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947042 4797 flags.go:64] FLAG: --registry-qps="5" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947051 4797 flags.go:64] FLAG: --reserved-cpus="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947063 4797 flags.go:64] FLAG: --reserved-memory="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947075 4797 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947084 4797 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947094 4797 flags.go:64] FLAG: --rotate-certificates="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947103 4797 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947112 4797 flags.go:64] FLAG: --runonce="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947122 4797 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947131 4797 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947141 4797 flags.go:64] FLAG: --seccomp-default="false" Oct 
13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947150 4797 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947160 4797 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947171 4797 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947181 4797 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947191 4797 flags.go:64] FLAG: --storage-driver-password="root" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947201 4797 flags.go:64] FLAG: --storage-driver-secure="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947210 4797 flags.go:64] FLAG: --storage-driver-table="stats" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947219 4797 flags.go:64] FLAG: --storage-driver-user="root" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947229 4797 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947238 4797 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947247 4797 flags.go:64] FLAG: --system-cgroups="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947256 4797 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947273 4797 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947282 4797 flags.go:64] FLAG: --tls-cert-file="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947291 4797 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947301 4797 flags.go:64] FLAG: --tls-min-version="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947310 4797 flags.go:64] FLAG: 
--tls-private-key-file="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947319 4797 flags.go:64] FLAG: --topology-manager-policy="none" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947329 4797 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947339 4797 flags.go:64] FLAG: --topology-manager-scope="container" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947349 4797 flags.go:64] FLAG: --v="2" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947364 4797 flags.go:64] FLAG: --version="false" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947378 4797 flags.go:64] FLAG: --vmodule="" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947391 4797 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.947401 4797 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947656 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947668 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947681 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947690 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947698 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947707 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947716 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 
13:07:02.947724 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947736 4797 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947745 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947757 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947765 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947774 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947782 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947791 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947801 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947839 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947850 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947861 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947870 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947879 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947889 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947899 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947908 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947916 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947924 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947934 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947942 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947950 4797 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947958 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947966 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947974 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 13 13:07:02 crc 
kubenswrapper[4797]: W1013 13:07:02.947982 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947989 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.947998 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948006 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948014 4797 feature_gate.go:330] unrecognized feature gate: Example Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948022 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948034 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948042 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948051 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948059 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948067 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948075 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948082 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948090 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948098 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948105 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948113 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948121 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948128 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948136 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948144 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948151 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948159 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948167 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948177 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948187 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948203 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948211 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948219 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948228 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948235 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948243 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948250 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948260 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948270 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948279 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948289 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948299 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.948310 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 
13:07:02.948343 4797 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.964111 4797 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.964169 4797 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964335 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964350 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964362 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964372 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964381 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964389 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964397 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964406 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964414 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 13:07:02 crc 
kubenswrapper[4797]: W1013 13:07:02.964422 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964430 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964438 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964448 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964456 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964466 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964474 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964482 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964491 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964500 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964511 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964523 4797 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964533 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964541 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964551 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964559 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964568 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964577 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964586 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964595 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964614 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964622 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964631 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964639 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964647 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964656 4797 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964664 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964672 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964681 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964689 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964698 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964706 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964714 4797 feature_gate.go:330] unrecognized feature gate: Example Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964722 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964730 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964739 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964747 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964755 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964763 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964771 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 13 13:07:02 crc 
kubenswrapper[4797]: W1013 13:07:02.964779 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964787 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964795 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964808 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964838 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964846 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964856 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964867 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964877 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964886 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964895 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964903 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964912 4797 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964921 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964929 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964937 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964950 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964961 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964970 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964980 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.964991 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965000 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.965015 4797 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965283 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965295 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965322 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965332 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965341 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965350 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965358 4797 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965367 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965376 4797 feature_gate.go:330] unrecognized feature 
gate: BuildCSIVolumes Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965384 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965392 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965400 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965409 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965418 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965426 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965434 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965443 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965451 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965460 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965468 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965479 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965490 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965499 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965509 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965517 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965526 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965534 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965542 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965550 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965560 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965568 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965579 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965589 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965598 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965607 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965616 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965623 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965631 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965642 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965652 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965661 4797 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965670 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965678 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965688 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965697 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965706 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965713 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965721 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965729 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965737 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965746 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965754 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965761 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965769 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965778 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965786 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965794 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965809 4797 feature_gate.go:330] unrecognized feature gate: Example Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965874 
4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965883 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965893 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965901 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965909 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965917 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965925 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965941 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965949 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965957 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965966 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965974 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 13:07:02 crc kubenswrapper[4797]: W1013 13:07:02.965982 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.965994 4797 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.966299 4797 server.go:940] "Client rotation is on, will bootstrap in background" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.973685 4797 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.973940 4797 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.975857 4797 server.go:997] "Starting client certificate rotation" Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.975910 4797 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.977093 4797 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-31 21:12:48.626004169 +0000 UTC Oct 13 13:07:02 crc kubenswrapper[4797]: I1013 13:07:02.977213 4797 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1904h5m45.64879649s for next certificate rotation Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.007128 4797 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.010491 4797 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.032423 4797 log.go:25] "Validated CRI v1 runtime API" Oct 13 13:07:03 crc 
kubenswrapper[4797]: I1013 13:07:03.078970 4797 log.go:25] "Validated CRI v1 image API" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.081872 4797 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.090433 4797 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-13-13-02-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.090510 4797 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.127965 4797 manager.go:217] Machine: {Timestamp:2025-10-13 13:07:03.124284371 +0000 UTC m=+0.657834727 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1126131d-f382-4ed8-9b1e-fad3c0f5c993 BootID:7c305ae9-a0eb-4806-bd54-a7ad9c447299 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2f:35:9f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2f:35:9f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:13:3c:3c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b4:bf:c3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:21:4d:f3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:22:47:85 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ce:cb:81 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:ed:b4:ae:03:75 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:52:42:d1:e4:06:a6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 
Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.128454 4797 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.128701 4797 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.129278 4797 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.129578 4797 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.129645 4797 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.130070 4797 topology_manager.go:138] "Creating topology manager with none policy" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.130092 4797 container_manager_linux.go:303] "Creating device plugin manager" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.130773 4797 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.130862 4797 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.131953 4797 state_mem.go:36] "Initialized new in-memory state store" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.132104 4797 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.135868 4797 kubelet.go:418] "Attempting to sync node with API server" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.135904 4797 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.135972 4797 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.136000 4797 kubelet.go:324] "Adding apiserver pod source" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.136021 4797 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.141003 4797 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.142295 4797 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.145161 4797 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.146540 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.146511 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.146712 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.146885 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147031 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147070 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 
13:07:03.147086 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147101 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147123 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147138 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147151 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147174 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147193 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147208 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147228 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.147242 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.148592 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.149477 4797 server.go:1280] "Started kubelet" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.151115 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.150714 4797 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.151109 4797 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 13 13:07:03 crc systemd[1]: Started Kubernetes Kubelet. Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.152509 4797 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.153779 4797 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.153880 4797 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.154150 4797 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 02:10:18.915242496 +0000 UTC Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.154216 4797 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 877h3m15.761031989s for next certificate rotation Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.154488 4797 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.154576 4797 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.154585 4797 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.154668 4797 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.156146 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="200ms" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.156287 4797 factory.go:55] Registering systemd factory Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.156322 4797 factory.go:221] Registration of the systemd container factory successfully Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.156872 4797 factory.go:153] Registering CRI-O factory Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.156798 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.162965 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.156929 4797 factory.go:221] Registration of the crio container factory successfully Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.163181 4797 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.163238 4797 factory.go:103] Registering Raw factory Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.163300 4797 manager.go:1196] Started watching for new ooms in manager Oct 13 13:07:03 crc kubenswrapper[4797]: 
I1013 13:07:03.164484 4797 manager.go:319] Starting recovery of all containers Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.165189 4797 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.147:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186e0ed784561399 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-13 13:07:03.149433753 +0000 UTC m=+0.682984049,LastTimestamp:2025-10-13 13:07:03.149433753 +0000 UTC m=+0.682984049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.169673 4797 server.go:460] "Adding debug handlers to kubelet server" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.173854 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.173921 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.173935 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.173945 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.173957 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.173967 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174002 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174013 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174025 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174034 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174044 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174057 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174067 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174081 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174093 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174104 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174116 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174127 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174137 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174147 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174157 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174171 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174209 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174225 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174238 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174251 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174266 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174282 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174297 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174327 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174341 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174353 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174366 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174377 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174389 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174403 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174414 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174425 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174438 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174450 4797 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174461 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174471 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174483 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174494 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174504 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174514 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174524 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174540 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174553 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174566 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174577 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174589 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174606 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.174619 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176257 4797 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176286 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176301 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176319 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176332 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176344 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176354 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176362 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176370 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176379 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176388 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176398 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176410 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176419 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176428 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176438 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176447 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176456 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176465 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176475 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176485 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176496 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" 
seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176507 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176519 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176530 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176541 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176554 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176563 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176574 4797 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176585 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176596 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176607 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176617 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176629 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176641 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176653 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176665 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176676 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176687 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176698 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176710 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176721 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176732 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176741 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176752 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176763 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176772 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176783 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176794 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176808 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176834 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176859 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176870 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176881 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176891 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176902 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176912 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176923 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176933 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176943 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176954 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176964 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176973 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176982 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.176992 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177002 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177022 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177033 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177042 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177052 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177062 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177071 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177079 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177087 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177097 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177107 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177117 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177126 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177137 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177147 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177156 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177166 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177175 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177190 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177199 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177209 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177220 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177231 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177241 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177251 4797 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177262 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177271 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177282 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177292 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177307 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177319 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177330 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177340 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177351 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177363 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177373 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177383 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177393 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177405 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177415 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177427 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177438 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177448 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177459 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177471 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177483 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177495 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177504 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177514 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177524 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177534 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177545 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177556 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177565 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177575 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177584 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177595 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177604 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177614 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177626 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177636 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177645 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177656 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177665 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177676 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177685 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177697 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177707 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177716 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177728 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177738 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177752 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177761 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177770 
4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177781 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177791 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177807 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177907 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177919 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177929 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177939 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177949 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177963 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177973 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177984 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.177997 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.178007 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.178019 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.178029 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.178039 4797 reconstruct.go:97] "Volume reconstruction finished" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.178047 4797 reconciler.go:26] "Reconciler: start to sync state" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.210494 4797 manager.go:324] Recovery completed Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.230459 4797 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.234610 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.234705 4797 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.234777 4797 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.234884 4797 kubelet.go:2335] "Starting kubelet main sync loop" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.235122 4797 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.235749 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.237641 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.236865 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.238450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.238600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.239449 4797 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.239561 4797 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" 
Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.239656 4797 state_mem.go:36] "Initialized new in-memory state store" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.254624 4797 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.257743 4797 policy_none.go:49] "None policy: Start" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.259770 4797 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.259925 4797 state_mem.go:35] "Initializing new in-memory state store" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.318341 4797 manager.go:334] "Starting Device Plugin manager" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.318452 4797 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.318484 4797 server.go:79] "Starting device plugin registration server" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.319318 4797 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.319364 4797 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.319574 4797 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.319787 4797 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.319897 4797 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.335918 4797 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.336056 4797 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.336069 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.337404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.337445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.337460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.337675 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.337898 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.337951 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339462 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339705 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339840 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.339890 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.340732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.340787 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.340801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.340839 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.340865 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.340884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.341090 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.341250 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.341284 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.342473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.342512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.342530 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.342687 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.343447 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.343497 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.343732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.343769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.343782 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.344569 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.344615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.344628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.345331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.345372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.345385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.345558 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.345600 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.346454 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.346511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.346539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.358011 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="400ms" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.380741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.380849 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.380921 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381077 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381134 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381167 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381195 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381221 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381248 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381290 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381314 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381335 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381377 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.381485 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.419738 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.421223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.421295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.421318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.421373 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.422153 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Oct 13 13:07:03 crc kubenswrapper[4797]: 
I1013 13:07:03.486307 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.486959 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.486975 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.486987 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487167 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487192 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487226 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487229 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487248 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487270 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487249 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487291 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487331 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487332 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487370 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487372 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487384 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487388 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487310 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487396 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487402 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487433 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487440 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487451 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487477 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487502 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487525 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.487618 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.622901 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.625089 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.625138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.625157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.625194 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.625716 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 
13:07:03.682885 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.710078 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.734571 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c9dd53d4a6258a65e8807419a82a718ee194126971132fdc3ca64a8b22351381 WatchSource:0}: Error finding container c9dd53d4a6258a65e8807419a82a718ee194126971132fdc3ca64a8b22351381: Status 404 returned error can't find the container with id c9dd53d4a6258a65e8807419a82a718ee194126971132fdc3ca64a8b22351381 Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.738189 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.743019 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-36727cb6cedb3896e98eeeec67ac77cd55d3413a869e016213c4a7f31f51d2ad WatchSource:0}: Error finding container 36727cb6cedb3896e98eeeec67ac77cd55d3413a869e016213c4a7f31f51d2ad: Status 404 returned error can't find the container with id 36727cb6cedb3896e98eeeec67ac77cd55d3413a869e016213c4a7f31f51d2ad Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.757985 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: E1013 13:07:03.759131 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="800ms" Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.763766 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5ea192b90ac549d9583ec1e1aab494b32ff09df0ff5b7ad87bc19f69fd026822 WatchSource:0}: Error finding container 5ea192b90ac549d9583ec1e1aab494b32ff09df0ff5b7ad87bc19f69fd026822: Status 404 returned error can't find the container with id 5ea192b90ac549d9583ec1e1aab494b32ff09df0ff5b7ad87bc19f69fd026822 Oct 13 13:07:03 crc kubenswrapper[4797]: I1013 13:07:03.770359 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.777177 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cf99b501f61b165cf4a783625d26e99773056e09fb4ff0424bf35ec64cf4bf05 WatchSource:0}: Error finding container cf99b501f61b165cf4a783625d26e99773056e09fb4ff0424bf35ec64cf4bf05: Status 404 returned error can't find the container with id cf99b501f61b165cf4a783625d26e99773056e09fb4ff0424bf35ec64cf4bf05 Oct 13 13:07:03 crc kubenswrapper[4797]: W1013 13:07:03.787697 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-550bef385bd9b216b9ac8f88ba37ca7b1766588e33c2dc24645e17e9044f0aeb WatchSource:0}: Error finding container 550bef385bd9b216b9ac8f88ba37ca7b1766588e33c2dc24645e17e9044f0aeb: Status 404 returned error can't find the container with id 550bef385bd9b216b9ac8f88ba37ca7b1766588e33c2dc24645e17e9044f0aeb Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.026313 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.028581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.028652 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.028675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.028720 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 
13:07:04.029388 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.153101 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.240615 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf99b501f61b165cf4a783625d26e99773056e09fb4ff0424bf35ec64cf4bf05"} Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.241656 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ea192b90ac549d9583ec1e1aab494b32ff09df0ff5b7ad87bc19f69fd026822"} Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.242975 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"36727cb6cedb3896e98eeeec67ac77cd55d3413a869e016213c4a7f31f51d2ad"} Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.244354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9dd53d4a6258a65e8807419a82a718ee194126971132fdc3ca64a8b22351381"} Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.245722 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"550bef385bd9b216b9ac8f88ba37ca7b1766588e33c2dc24645e17e9044f0aeb"} Oct 13 13:07:04 crc kubenswrapper[4797]: W1013 13:07:04.481191 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 13:07:04.481732 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:04 crc kubenswrapper[4797]: W1013 13:07:04.504328 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 13:07:04.504419 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 13:07:04.560973 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="1.6s" Oct 13 13:07:04 crc kubenswrapper[4797]: 
W1013 13:07:04.612761 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 13:07:04.612928 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:04 crc kubenswrapper[4797]: W1013 13:07:04.783120 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 13:07:04.783262 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.830461 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.831930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.831978 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.831994 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:04 crc kubenswrapper[4797]: I1013 13:07:04.832030 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:04 crc kubenswrapper[4797]: E1013 13:07:04.832558 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.153839 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.251370 4797 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6" exitCode=0 Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.251529 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.251607 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.253076 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.253116 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.253127 4797 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.253731 4797 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="114ccc69a396d2e48941fe7dc1e9b68f9cf6297746930ee02ce8aa98273064d8" exitCode=0 Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.253841 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"114ccc69a396d2e48941fe7dc1e9b68f9cf6297746930ee02ce8aa98273064d8"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.253930 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.256832 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.256871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.256885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.259053 4797 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23" exitCode=0 Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.259136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.259184 4797 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.261321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.261344 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.261355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.265462 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.265496 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.265509 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.267258 4797 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724" exitCode=0 Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.267310 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724"} Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.267439 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.268714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.268745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.268756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.271999 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.273131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.273182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:05 crc kubenswrapper[4797]: I1013 13:07:05.273202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.152895 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:06 crc kubenswrapper[4797]: E1013 13:07:06.162844 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="3.2s" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.278731 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.278891 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.278917 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.278910 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.280381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.280420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.280436 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.286001 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.286244 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.287332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.287374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.287385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:06 crc kubenswrapper[4797]: W1013 13:07:06.290155 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Oct 13 13:07:06 crc kubenswrapper[4797]: E1013 13:07:06.290250 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.292270 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd"} Oct 13 13:07:06 crc 
kubenswrapper[4797]: I1013 13:07:06.292310 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.292326 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.292345 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.294611 4797 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde" exitCode=0 Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.294672 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.294855 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.296427 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.296468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:06 
crc kubenswrapper[4797]: I1013 13:07:06.296481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.300159 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"63f3b10625b7649804d048f23322a40de7a36368dd72faed1c1f3a313c64f452"} Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.300266 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.302416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.302556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.302618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.433500 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.436291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.436401 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.436423 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:06 crc kubenswrapper[4797]: I1013 13:07:06.436490 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:06 
crc kubenswrapper[4797]: E1013 13:07:06.437052 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.309925 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629"} Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.310073 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.311614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.311668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.311687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.314791 4797 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe" exitCode=0 Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.314970 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.315070 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.315076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe"} Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.315123 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.316028 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.315966 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.316883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.316933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.316954 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.316967 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.317001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.317021 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.316934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.317100 4797 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.317131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.319365 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.319440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.319460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.388139 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:07 crc kubenswrapper[4797]: I1013 13:07:07.669221 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.323106 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.322991 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722"} Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.323106 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.323111 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.323974 4797 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.324024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d"} Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.324054 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471"} Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325301 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325383 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.325389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.862410 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:08 crc kubenswrapper[4797]: I1013 13:07:08.875943 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.335001 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19"} Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.335104 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109"} Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.335140 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.335244 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.335384 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.336741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.336853 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.336880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.337453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.337525 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.337551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.338022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.338097 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.338118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.637896 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.644222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.644470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.644840 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 
13:07:09 crc kubenswrapper[4797]: I1013 13:07:09.645931 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.321701 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.333334 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.338362 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.338418 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.338365 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.340684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.340746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.340761 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.340769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.340794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.340916 4797 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.341136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.341158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:10 crc kubenswrapper[4797]: I1013 13:07:10.341167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:11 crc kubenswrapper[4797]: I1013 13:07:11.341711 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:11 crc kubenswrapper[4797]: I1013 13:07:11.343388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:11 crc kubenswrapper[4797]: I1013 13:07:11.343464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:11 crc kubenswrapper[4797]: I1013 13:07:11.343492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:13 crc kubenswrapper[4797]: E1013 13:07:13.336868 4797 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 13 13:07:13 crc kubenswrapper[4797]: I1013 13:07:13.588538 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:13 crc kubenswrapper[4797]: I1013 13:07:13.589520 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:13 crc kubenswrapper[4797]: I1013 13:07:13.591302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:13 crc 
kubenswrapper[4797]: I1013 13:07:13.591371 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:13 crc kubenswrapper[4797]: I1013 13:07:13.591394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:14 crc kubenswrapper[4797]: I1013 13:07:14.102557 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 13 13:07:14 crc kubenswrapper[4797]: I1013 13:07:14.102886 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:14 crc kubenswrapper[4797]: I1013 13:07:14.104690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:14 crc kubenswrapper[4797]: I1013 13:07:14.104758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:14 crc kubenswrapper[4797]: I1013 13:07:14.104779 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.588737 4797 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.588885 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 13:07:16 crc 
kubenswrapper[4797]: W1013 13:07:16.774287 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.774397 4797 trace.go:236] Trace[196555097]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 13:07:06.772) (total time: 10002ms): Oct 13 13:07:16 crc kubenswrapper[4797]: Trace[196555097]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (13:07:16.774) Oct 13 13:07:16 crc kubenswrapper[4797]: Trace[196555097]: [10.002188354s] [10.002188354s] END Oct 13 13:07:16 crc kubenswrapper[4797]: E1013 13:07:16.774425 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.914712 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.915094 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.917091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.917151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:16 crc kubenswrapper[4797]: I1013 13:07:16.917164 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:17 crc kubenswrapper[4797]: W1013 13:07:17.144784 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 13 13:07:17 crc kubenswrapper[4797]: I1013 13:07:17.144957 4797 trace.go:236] Trace[824934267]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 13:07:07.143) (total time: 10001ms): Oct 13 13:07:17 crc kubenswrapper[4797]: Trace[824934267]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:07:17.144) Oct 13 13:07:17 crc kubenswrapper[4797]: Trace[824934267]: [10.001451454s] [10.001451454s] END Oct 13 13:07:17 crc kubenswrapper[4797]: E1013 13:07:17.144990 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 13 13:07:17 crc kubenswrapper[4797]: I1013 13:07:17.153006 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 13 13:07:17 crc kubenswrapper[4797]: I1013 13:07:17.388398 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 13 13:07:17 crc 
kubenswrapper[4797]: I1013 13:07:17.388488 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 13:07:17 crc kubenswrapper[4797]: W1013 13:07:17.966875 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 13 13:07:17 crc kubenswrapper[4797]: I1013 13:07:17.966978 4797 trace.go:236] Trace[1285835145]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 13:07:07.963) (total time: 10003ms): Oct 13 13:07:17 crc kubenswrapper[4797]: Trace[1285835145]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (13:07:17.966) Oct 13 13:07:17 crc kubenswrapper[4797]: Trace[1285835145]: [10.003925568s] [10.003925568s] END Oct 13 13:07:17 crc kubenswrapper[4797]: E1013 13:07:17.967008 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 13 13:07:18 crc kubenswrapper[4797]: I1013 13:07:18.618308 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 13 13:07:18 crc kubenswrapper[4797]: I1013 13:07:18.618379 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 13 13:07:20 crc kubenswrapper[4797]: I1013 13:07:20.327823 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:20 crc kubenswrapper[4797]: I1013 13:07:20.328051 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:20 crc kubenswrapper[4797]: I1013 13:07:20.329690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:20 crc kubenswrapper[4797]: I1013 13:07:20.329733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:20 crc kubenswrapper[4797]: I1013 13:07:20.329743 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:20 crc kubenswrapper[4797]: I1013 13:07:20.613314 4797 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.394121 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.394335 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.395639 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 
13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.395683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.395697 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.399868 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.562858 4797 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 13 13:07:22 crc kubenswrapper[4797]: I1013 13:07:22.762344 4797 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.146855 4797 apiserver.go:52] "Watching apiserver" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.152161 4797 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.152609 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.153050 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.153059 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.153285 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.153393 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.153441 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.153489 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.153504 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.153552 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.153613 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.155162 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.155403 4797 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.155779 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.156438 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.156437 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.156715 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.157099 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.157208 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.157228 4797 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.158339 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.182956 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.195626 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.208100 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.223988 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.235232 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.253203 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.265877 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.279294 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.292616 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.304418 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.319958 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.333952 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.348986 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.363329 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.374873 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.395775 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.607742 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.615509 4797 trace.go:236] Trace[843872510]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 13:07:12.495) (total time: 11120ms): Oct 13 13:07:23 crc kubenswrapper[4797]: Trace[843872510]: ---"Objects listed" error: 11119ms (13:07:23.614) Oct 13 13:07:23 crc kubenswrapper[4797]: Trace[843872510]: [11.120291864s] [11.120291864s] END Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.615570 4797 reflector.go:368] Caches populated 
for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.616019 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.617036 4797 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.655355 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.665908 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.682470 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.697425 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.703302 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717510 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717570 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717594 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717615 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717635 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717679 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717702 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717728 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717754 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717775 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") 
" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717798 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717837 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717857 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717876 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717918 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717939 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717959 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.717995 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718015 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc 
kubenswrapper[4797]: I1013 13:07:23.718037 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718058 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718084 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718106 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718128 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718149 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718170 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718194 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718216 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718264 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 
13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718285 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718336 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718358 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718378 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718399 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718418 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718439 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718461 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718484 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718496 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718515 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718524 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718508 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718516 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718586 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718593 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718612 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718637 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718661 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 
13:07:23.718682 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718685 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718701 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718722 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718727 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718742 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718740 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718786 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718825 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718840 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718849 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718862 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718967 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718998 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719029 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719102 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719132 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719158 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719277 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719308 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719312 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719335 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719409 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719415 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719484 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719514 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719547 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719664 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719693 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719700 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719739 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719963 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719754 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719827 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719833 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719870 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.719930 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.718845 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720147 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720583 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720614 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720651 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720680 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720792 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720856 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720880 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720884 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.720956 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721066 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721249 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721370 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721463 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721498 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721914 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721931 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721684 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721947 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721781 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721796 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721823 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.721835 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722034 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722348 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722556 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722571 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722589 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722630 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722648 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722650 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: 
"7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722714 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722729 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722738 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722746 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722863 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723035 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.722887 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723216 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723239 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723259 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723285 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723306 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723341 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723359 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723366 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723407 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723425 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723442 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723440 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723457 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723474 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723489 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723506 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723521 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723567 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723583 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723601 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723624 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723643 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723669 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723691 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723712 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723730 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723750 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723771 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723796 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723854 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723884 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723899 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723913 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 13 13:07:23 
crc kubenswrapper[4797]: I1013 13:07:23.723928 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723944 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723961 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.723983 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724005 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724026 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724071 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724096 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724118 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724140 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724165 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724188 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724212 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724326 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724377 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 13 
13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724412 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724435 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724459 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724484 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724507 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724529 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724546 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724563 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724577 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724592 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724609 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724624 
4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724643 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724657 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724672 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724686 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724701 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724715 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724729 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724743 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724757 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724774 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724790 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724850 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724884 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724901 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724916 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724930 
4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724947 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724962 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724978 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.724992 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725007 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" 
(UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725023 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725037 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725053 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725071 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725086 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725103 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725118 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725134 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725166 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725180 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 
13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725199 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725223 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725248 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725272 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725298 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725315 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725329 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725345 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725360 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725376 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725395 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725410 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725425 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725441 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725457 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725472 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725488 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725505 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725521 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725538 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725556 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725573 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725588 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725606 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725624 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725641 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725657 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725674 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725689 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725705 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725720 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725759 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725785 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725824 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725850 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725870 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725886 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725906 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725924 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725941 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725958 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.725993 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726009 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726124 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726136 4797 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726147 4797 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726156 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726165 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726174 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726184 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726193 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726202 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726212 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726429 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726439 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726449 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726458 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726466 4797 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726475 4797 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726484 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726493 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 13 
13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726502 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726511 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726520 4797 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726529 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726539 4797 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726550 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726562 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726573 
4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726581 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726590 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726600 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726611 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726620 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726630 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726638 4797 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726648 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726657 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726666 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726675 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726685 4797 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726699 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726711 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on 
node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726723 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726735 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726747 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726762 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726774 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726787 4797 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.726818 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: 
I1013 13:07:23.734882 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734906 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734922 4797 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734941 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734955 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734969 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735005 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735019 4797 reconciler_common.go:293] "Volume detached 
for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735033 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735049 4797 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735062 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735074 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735088 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735099 4797 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735112 4797 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735125 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.731439 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.746457 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.755609 4797 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.761347 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.774425 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.727038 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.727298 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.727560 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.727838 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.727869 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.728152 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.728652 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.728957 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.729162 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.731888 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732090 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732243 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732367 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732411 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732516 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732884 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732932 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.732933 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.733189 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.733238 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.733274 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.733741 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.733761 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.775023 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734033 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734168 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734174 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734481 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.734735 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.735928 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.736209 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.736334 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.736814 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.737855 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738003 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738037 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738129 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738244 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.738375 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:07:24.238326326 +0000 UTC m=+21.771876582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738450 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738576 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738590 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738698 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738847 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.738980 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.739046 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.739128 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.744697 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.744730 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.746026 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.746786 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.747304 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.747506 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.747741 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.748450 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.749980 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.750159 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.750727 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.750901 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.751003 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.752458 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.752875 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.752998 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.753260 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.753412 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.753457 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.753482 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.753528 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754103 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754167 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754215 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754476 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754670 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754700 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754737 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.754910 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.755012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.755048 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.755118 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.775637 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:24.275615887 +0000 UTC m=+21.809166143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.755250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.756404 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.759388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.760563 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.761267 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.764931 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.766327 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.766367 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.766469 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.766507 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.766602 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.766617 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.767050 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.767127 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.771009 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.771284 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.773327 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.774193 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.774268 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.774353 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.775888 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:24.275879874 +0000 UTC m=+21.809430130 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.776064 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.781015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.781431 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.781755 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.782065 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.782436 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.782742 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.782848 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.783644 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.783724 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.785009 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.786928 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.787503 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793044 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793051 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793053 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793300 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793322 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793630 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793752 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.793944 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.794119 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.794191 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.794640 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.794693 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.794883 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.794992 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.796528 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.798045 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.798076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.798192 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.802104 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.802135 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.802148 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.802210 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:24.302191998 +0000 UTC m=+21.835742254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.802504 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.806326 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.806332 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.807205 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.807506 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.807530 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.807542 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:23 crc kubenswrapper[4797]: E1013 13:07:23.807579 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:24.307566448 +0000 UTC m=+21.841116714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.820925 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.824040 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.825367 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.827509 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: 
"catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.827792 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.827990 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836033 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836079 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836121 4797 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836161 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836170 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836178 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836187 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836195 4797 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836203 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836211 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836219 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836227 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836235 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836243 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836253 4797 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836262 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836270 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc 
kubenswrapper[4797]: I1013 13:07:23.836279 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836287 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836295 4797 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836304 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836312 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836322 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836331 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836340 4797 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836349 4797 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836358 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836366 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836400 4797 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836410 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836418 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836426 4797 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836435 4797 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836575 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836590 4797 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836648 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836658 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836667 4797 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc 
kubenswrapper[4797]: I1013 13:07:23.836676 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836685 4797 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836694 4797 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836703 4797 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836712 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836720 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836728 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836736 4797 reconciler_common.go:293] "Volume detached for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836764 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836773 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836782 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836792 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836840 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836850 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836858 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc 
kubenswrapper[4797]: I1013 13:07:23.836866 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836875 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836883 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836891 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836900 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836908 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836916 4797 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836925 4797 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836933 4797 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836941 4797 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836949 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836958 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836966 4797 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836975 4797 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836985 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.836995 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837004 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837013 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837022 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837030 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837039 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837047 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc 
kubenswrapper[4797]: I1013 13:07:23.837055 4797 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837064 4797 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837072 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837081 4797 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837089 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837098 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837106 4797 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837115 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837123 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837131 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837140 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837148 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837157 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837165 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837173 4797 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" 
Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837182 4797 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837190 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837199 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837207 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837214 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837222 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837230 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837238 4797 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837247 4797 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837255 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837262 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837270 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837280 4797 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837287 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837296 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837304 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837317 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837325 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837333 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837341 4797 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837349 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837358 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837367 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837377 4797 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837388 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837398 4797 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837406 4797 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837414 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837422 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837432 4797 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837440 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837483 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837511 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837520 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837528 4797 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837537 4797 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837545 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837553 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837560 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837579 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837589 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837596 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837642 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837691 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.837862 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.840895 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.860778 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.881576 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.895595 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.906750 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117e
e1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.916970 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.927138 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.934563 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.938615 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.945120 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:23 crc kubenswrapper[4797]: I1013 13:07:23.955594 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.067703 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.074267 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.082232 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 13:07:24 crc kubenswrapper[4797]: W1013 13:07:24.098337 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-577717795e009f1ef8fa894a8b9fdacfca122d4afd7b1155d17ab2860d4be32c WatchSource:0}: Error finding container 577717795e009f1ef8fa894a8b9fdacfca122d4afd7b1155d17ab2860d4be32c: Status 404 returned error can't find the container with id 577717795e009f1ef8fa894a8b9fdacfca122d4afd7b1155d17ab2860d4be32c Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.242379 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.242562 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:07:25.242545397 +0000 UTC m=+22.776095643 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.343181 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.343219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.343238 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.343262 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343368 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343410 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:25.343394675 +0000 UTC m=+22.876944931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343457 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343523 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343557 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:24 
crc kubenswrapper[4797]: E1013 13:07:24.343567 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:25.343545459 +0000 UTC m=+22.877095715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343577 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343470 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343610 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343623 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343629 4797 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:25.34361223 +0000 UTC m=+22.877162646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.343660 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:25.343649241 +0000 UTC m=+22.877199727 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.384068 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"577717795e009f1ef8fa894a8b9fdacfca122d4afd7b1155d17ab2860d4be32c"} Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.385761 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549"} Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.385832 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39"} Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.385844 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3364899ce3090f7f33a1762a385282509339239e7999662ded73c6c04c464e82"} Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.386891 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8"} Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.386955 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6600c60ac5b8b083ee8c5d489b1d660128325b667f2568c53e8780f8efc34112"} Oct 13 13:07:24 crc kubenswrapper[4797]: E1013 13:07:24.392188 4797 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.397211 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.417897 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.430802 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.443982 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.457260 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.470938 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.500742 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.528102 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.571921 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.588562 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.599644 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.609152 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.619920 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.630349 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.639843 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.649473 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.978339 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5jgrm"] Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.978928 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.981675 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.983311 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 13 13:07:24 crc kubenswrapper[4797]: I1013 13:07:24.984369 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:24.999927 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:24Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.017226 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.027930 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.043467 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.059983 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.072311 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.085609 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-
kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.098308 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.111988 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.151507 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/680a49a0-7eff-44a9-8ab8-e4b52f4743c6-hosts-file\") pod \"node-resolver-5jgrm\" (UID: \"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\") " pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.151561 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pptw6\" (UniqueName: \"kubernetes.io/projected/680a49a0-7eff-44a9-8ab8-e4b52f4743c6-kube-api-access-pptw6\") pod \"node-resolver-5jgrm\" (UID: \"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\") " pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.235970 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.236038 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.236091 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.236116 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.236208 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.236264 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.240202 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.241445 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.244738 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.246284 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.248710 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.250081 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.251460 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.252486 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.252577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pptw6\" (UniqueName: \"kubernetes.io/projected/680a49a0-7eff-44a9-8ab8-e4b52f4743c6-kube-api-access-pptw6\") pod \"node-resolver-5jgrm\" (UID: \"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\") " pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.252653 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:07:27.252630502 +0000 UTC m=+24.786180758 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.252734 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/680a49a0-7eff-44a9-8ab8-e4b52f4743c6-hosts-file\") pod \"node-resolver-5jgrm\" (UID: \"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\") " pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.252851 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/680a49a0-7eff-44a9-8ab8-e4b52f4743c6-hosts-file\") pod \"node-resolver-5jgrm\" (UID: \"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\") " pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.254339 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.255579 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.256347 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.257037 4797 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.258347 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.259218 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.259724 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.260232 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.261120 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.261671 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.262649 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.263377 4797 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.264078 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.267052 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.267847 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.268404 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.269662 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.270332 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.271231 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.272184 4797 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.273046 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.274059 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.274746 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.275403 4797 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.275537 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.277879 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.278449 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 13 13:07:25 
crc kubenswrapper[4797]: I1013 13:07:25.278880 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.281203 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.282656 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.283271 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.284685 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.285874 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.287166 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.288083 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 13 13:07:25 
crc kubenswrapper[4797]: I1013 13:07:25.289446 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pptw6\" (UniqueName: \"kubernetes.io/projected/680a49a0-7eff-44a9-8ab8-e4b52f4743c6-kube-api-access-pptw6\") pod \"node-resolver-5jgrm\" (UID: \"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\") " pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.289563 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.290635 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.294486 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.295289 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.296838 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.297911 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.299298 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.300014 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.300711 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.302060 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.302943 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.304197 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.312056 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5jgrm" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.355360 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.355431 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.355473 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.355892 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.355949 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.355984 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:27.355954959 +0000 UTC m=+24.889505245 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.356076 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:27.356042121 +0000 UTC m=+24.889592437 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.356211 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.356236 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.356255 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.356300 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:27.356286517 +0000 UTC m=+24.889836873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.356299 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hc9bk"] Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.358076 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.358224 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.358310 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:25 crc kubenswrapper[4797]: E1013 13:07:25.358407 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:27.358392517 +0000 UTC m=+24.891942773 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.359504 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6gbdx"] Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.359744 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dhk2q"] Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.360207 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.360387 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.361466 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hrdxs"] Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.364857 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.365052 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.355508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.366308 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.366441 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.366307 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.367995 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.368262 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.368528 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.368713 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 
13:07:25.372067 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.372477 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.372701 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.372962 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373071 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373203 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373411 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373497 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373708 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373901 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.373937 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 13 13:07:25 crc 
kubenswrapper[4797]: I1013 13:07:25.374142 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.389096 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117e
e1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.391369 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5jgrm" event={"ID":"680a49a0-7eff-44a9-8ab8-e4b52f4743c6","Type":"ContainerStarted","Data":"71f9bf5f2fef12f7897c6c6bdba5b31b6bcfd1a240a8cdd05367aa9aadcf3121"} Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.403965 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.417032 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.432317 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.445848 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.455451 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466413 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-cnibin\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-etc-kubernetes\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466656 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466722 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-ovn-kubernetes\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466782 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-netd\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466894 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94a6e41f-8980-41db-a008-d5a81058cdba-cni-binary-copy\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466928 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-system-cni-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-node-log\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.466986 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-log-socket\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.467084 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-config\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.467122 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2wq7\" (UniqueName: \"kubernetes.io/projected/658edc6a-9975-4d8b-9551-821edcc32ce1-kube-api-access-z2wq7\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.467173 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-bin\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.467215 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-kubelet\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468430 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-multus-certs\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468491 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/345b1c60-ba79-407d-8423-53010f2dfeb0-mcd-auth-proxy-config\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468519 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-cnibin\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468545 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-cni-bin\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468561 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-systemd-units\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468575 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-var-lib-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468593 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-env-overrides\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468613 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc 
kubenswrapper[4797]: I1013 13:07:25.468627 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94a6e41f-8980-41db-a008-d5a81058cdba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468641 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g9sc\" (UniqueName: \"kubernetes.io/projected/94a6e41f-8980-41db-a008-d5a81058cdba-kube-api-access-9g9sc\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468659 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-cni-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468675 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-netns\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468689 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-systemd\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468705 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/345b1c60-ba79-407d-8423-53010f2dfeb0-rootfs\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/345b1c60-ba79-407d-8423-53010f2dfeb0-proxy-tls\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468737 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mqw\" (UniqueName: \"kubernetes.io/projected/345b1c60-ba79-407d-8423-53010f2dfeb0-kube-api-access-t4mqw\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468753 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-os-release\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468769 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-socket-dir-parent\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468787 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-conf-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468834 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-daemon-config\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468856 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-ovn\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468872 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468891 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-script-lib\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468909 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-kubelet\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468929 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-os-release\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468947 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-cni-multus\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-slash\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468981 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-etc-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.468997 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-system-cni-dir\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.469014 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2ab9f14-aae8-45ef-880e-a1563e920f87-cni-binary-copy\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.469030 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkc2d\" (UniqueName: \"kubernetes.io/projected/b2ab9f14-aae8-45ef-880e-a1563e920f87-kube-api-access-rkc2d\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.469068 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-k8s-cni-cncf-io\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.469083 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-netns\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.469097 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-hostroot\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.469109 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/658edc6a-9975-4d8b-9551-821edcc32ce1-ovn-node-metrics-cert\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.472303 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.487100 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.502191 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.517322 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.527522 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.539209 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.549111 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.562629 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.570851 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-daemon-config\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.570951 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-ovn\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571006 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571036 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-script-lib\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571062 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-ovn\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571116 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571714 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-script-lib\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571840 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-daemon-config\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571932 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-os-release\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.571958 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-socket-dir-parent\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " 
pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572072 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-socket-dir-parent\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572207 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-os-release\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572320 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-conf-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572367 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-conf-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572392 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-kubelet\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572443 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-os-release\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572468 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-kubelet\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-slash\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572502 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-etc-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-slash\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572504 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-os-release\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572564 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-cni-multus\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572582 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-system-cni-dir\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572564 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-etc-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2ab9f14-aae8-45ef-880e-a1563e920f87-cni-binary-copy\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572625 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkc2d\" (UniqueName: \"kubernetes.io/projected/b2ab9f14-aae8-45ef-880e-a1563e920f87-kube-api-access-rkc2d\") pod 
\"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572626 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-cni-multus\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.572638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-netns\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573028 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-system-cni-dir\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573169 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2ab9f14-aae8-45ef-880e-a1563e920f87-cni-binary-copy\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573199 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-netns\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: 
I1013 13:07:25.573230 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-hostroot\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573204 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-hostroot\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573265 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/658edc6a-9975-4d8b-9551-821edcc32ce1-ovn-node-metrics-cert\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-k8s-cni-cncf-io\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-cnibin\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573340 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-etc-kubernetes\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573361 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573386 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-ovn-kubernetes\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573407 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-netd\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573438 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-node-log\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573460 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-log-socket\") pod 
\"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573479 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-config\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573498 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2wq7\" (UniqueName: \"kubernetes.io/projected/658edc6a-9975-4d8b-9551-821edcc32ce1-kube-api-access-z2wq7\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94a6e41f-8980-41db-a008-d5a81058cdba-cni-binary-copy\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-system-cni-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573564 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-bin\") pod \"ovnkube-node-dhk2q\" (UID: 
\"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/345b1c60-ba79-407d-8423-53010f2dfeb0-mcd-auth-proxy-config\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573621 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-kubelet\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573642 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-multus-certs\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573662 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-cni-bin\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573682 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-systemd-units\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573701 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-var-lib-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573721 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-env-overrides\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573741 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-cnibin\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573780 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-netns\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc 
kubenswrapper[4797]: I1013 13:07:25.573816 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-systemd\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573836 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/345b1c60-ba79-407d-8423-53010f2dfeb0-rootfs\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573856 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/345b1c60-ba79-407d-8423-53010f2dfeb0-proxy-tls\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573876 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mqw\" (UniqueName: \"kubernetes.io/projected/345b1c60-ba79-407d-8423-53010f2dfeb0-kube-api-access-t4mqw\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573899 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94a6e41f-8980-41db-a008-d5a81058cdba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" 
Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.573920 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g9sc\" (UniqueName: \"kubernetes.io/projected/94a6e41f-8980-41db-a008-d5a81058cdba-kube-api-access-9g9sc\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574004 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-cni-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574151 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-multus-cni-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574183 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-k8s-cni-cncf-io\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574229 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-cnibin\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574257 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-etc-kubernetes\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574283 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574309 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-ovn-kubernetes\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574335 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-netd\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574298 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574382 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-log-socket\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574363 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-node-log\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.574637 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-var-lib-openvswitch\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.575215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-config\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576182 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-env-overrides\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-cnibin\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576269 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94a6e41f-8980-41db-a008-d5a81058cdba-cni-binary-copy\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576468 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-kubelet\") pod \"multus-6gbdx\" (UID: 
\"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576481 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-systemd\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576529 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-netns\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576562 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-bin\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576768 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-var-lib-cni-bin\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576847 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-host-run-multus-certs\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576859 
4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94a6e41f-8980-41db-a008-d5a81058cdba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.576896 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-systemd-units\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.577658 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94a6e41f-8980-41db-a008-d5a81058cdba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.578385 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/345b1c60-ba79-407d-8423-53010f2dfeb0-rootfs\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.578615 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/658edc6a-9975-4d8b-9551-821edcc32ce1-ovn-node-metrics-cert\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.578868 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2ab9f14-aae8-45ef-880e-a1563e920f87-system-cni-dir\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.578959 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/345b1c60-ba79-407d-8423-53010f2dfeb0-mcd-auth-proxy-config\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.587174 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/345b1c60-ba79-407d-8423-53010f2dfeb0-proxy-tls\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.595241 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mqw\" (UniqueName: \"kubernetes.io/projected/345b1c60-ba79-407d-8423-53010f2dfeb0-kube-api-access-t4mqw\") pod \"machine-config-daemon-hrdxs\" (UID: \"345b1c60-ba79-407d-8423-53010f2dfeb0\") " pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.595826 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g9sc\" (UniqueName: \"kubernetes.io/projected/94a6e41f-8980-41db-a008-d5a81058cdba-kube-api-access-9g9sc\") pod \"multus-additional-cni-plugins-hc9bk\" (UID: \"94a6e41f-8980-41db-a008-d5a81058cdba\") " pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.595836 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rkc2d\" (UniqueName: \"kubernetes.io/projected/b2ab9f14-aae8-45ef-880e-a1563e920f87-kube-api-access-rkc2d\") pod \"multus-6gbdx\" (UID: \"b2ab9f14-aae8-45ef-880e-a1563e920f87\") " pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.597556 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.600176 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2wq7\" (UniqueName: \"kubernetes.io/projected/658edc6a-9975-4d8b-9551-821edcc32ce1-kube-api-access-z2wq7\") pod \"ovnkube-node-dhk2q\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.613928 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.627206 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.641788 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.658130 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741b
ed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.673797 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.687792 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.689933 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.699218 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6gbdx" Oct 13 13:07:25 crc kubenswrapper[4797]: W1013 13:07:25.703488 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a6e41f_8980_41db_a008_d5a81058cdba.slice/crio-906d4e97ffbbb9ad46ae1bdfcd6f6fa199d1f9327aa45ef7cfbeb5d5c76722cd WatchSource:0}: Error finding container 906d4e97ffbbb9ad46ae1bdfcd6f6fa199d1f9327aa45ef7cfbeb5d5c76722cd: Status 404 returned error can't find the container with id 906d4e97ffbbb9ad46ae1bdfcd6f6fa199d1f9327aa45ef7cfbeb5d5c76722cd Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.711061 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.714292 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:07:25 crc kubenswrapper[4797]: W1013 13:07:25.719385 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ab9f14_aae8_45ef_880e_a1563e920f87.slice/crio-69360d8e1f832f23dca9890613ddc6614d22c8dcf301419791ec7750df4f2837 WatchSource:0}: Error finding container 69360d8e1f832f23dca9890613ddc6614d22c8dcf301419791ec7750df4f2837: Status 404 returned error can't find the container with id 69360d8e1f832f23dca9890613ddc6614d22c8dcf301419791ec7750df4f2837 Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.726934 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:25 crc kubenswrapper[4797]: I1013 13:07:25.735254 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:25Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:25 crc kubenswrapper[4797]: W1013 13:07:25.743213 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345b1c60_ba79_407d_8423_53010f2dfeb0.slice/crio-41ad72feb9eaa88f2f9198403f960850a68bc321fd0468cfe7ef1d7004b0eaed WatchSource:0}: Error finding container 41ad72feb9eaa88f2f9198403f960850a68bc321fd0468cfe7ef1d7004b0eaed: Status 404 returned error can't find the container with id 41ad72feb9eaa88f2f9198403f960850a68bc321fd0468cfe7ef1d7004b0eaed Oct 13 13:07:25 crc kubenswrapper[4797]: W1013 13:07:25.746751 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658edc6a_9975_4d8b_9551_821edcc32ce1.slice/crio-23f2e2805650d7d1dd19457d52d7fbaa2345c59697754367b63f611235d82b36 WatchSource:0}: Error finding container 23f2e2805650d7d1dd19457d52d7fbaa2345c59697754367b63f611235d82b36: Status 404 returned error can't find the container with id 23f2e2805650d7d1dd19457d52d7fbaa2345c59697754367b63f611235d82b36 Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.396994 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" exitCode=0 Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.397166 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.397568 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"23f2e2805650d7d1dd19457d52d7fbaa2345c59697754367b63f611235d82b36"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.400842 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.401011 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.401103 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"41ad72feb9eaa88f2f9198403f960850a68bc321fd0468cfe7ef1d7004b0eaed"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.404606 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5jgrm" event={"ID":"680a49a0-7eff-44a9-8ab8-e4b52f4743c6","Type":"ContainerStarted","Data":"875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.407268 4797 generic.go:334] "Generic (PLEG): container finished" podID="94a6e41f-8980-41db-a008-d5a81058cdba" containerID="55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984" exitCode=0 Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.407346 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerDied","Data":"55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.407411 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerStarted","Data":"906d4e97ffbbb9ad46ae1bdfcd6f6fa199d1f9327aa45ef7cfbeb5d5c76722cd"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.409994 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gbdx" event={"ID":"b2ab9f14-aae8-45ef-880e-a1563e920f87","Type":"ContainerStarted","Data":"414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.410034 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gbdx" event={"ID":"b2ab9f14-aae8-45ef-880e-a1563e920f87","Type":"ContainerStarted","Data":"69360d8e1f832f23dca9890613ddc6614d22c8dcf301419791ec7750df4f2837"} Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.435389 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.456797 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.486091 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.522757 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.542632 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.572080 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.599550 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.614785 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.635230 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.650140 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.670062 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.683970 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.703274 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.718361 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.742876 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.757257 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.778228 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.794212 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.821930 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.840225 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.855650 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.868490 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.885916 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.901052 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.914476 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.926893 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.929414 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7c2fp"] Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.929874 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.931847 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.932139 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.932303 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.932574 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.952336 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.952426 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.963760 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.967120 4797 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.968048 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 13 13:07:26 crc kubenswrapper[4797]: I1013 13:07:26.984617 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4
cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:26Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.004310 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.019989 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.036213 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.050526 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.063204 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.077033 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.090569 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.100278 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-host\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.100321 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgkjm\" (UniqueName: \"kubernetes.io/projected/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-kube-api-access-hgkjm\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.100378 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-serviceca\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.102703 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.113033 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.123764 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.139879 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.151704 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.162382 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.172729 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.184243 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.201211 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-serviceca\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.201273 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgkjm\" (UniqueName: \"kubernetes.io/projected/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-kube-api-access-hgkjm\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.201292 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-host\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.201358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-host\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.202245 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-serviceca\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.220981 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.235338 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.235374 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.235413 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.235501 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.235664 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.235846 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.245204 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgkjm\" (UniqueName: \"kubernetes.io/projected/2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6-kube-api-access-hgkjm\") pod \"node-ca-7c2fp\" (UID: \"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\") " pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.284126 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.301725 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.302038 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:07:31.302014555 +0000 UTC m=+28.835564811 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.323358 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.362862 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.401006 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.402532 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.402572 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.402612 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.402637 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.402767 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.402843 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:31.402825932 +0000 UTC m=+28.936376188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.402856 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.402972 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:31.402942045 +0000 UTC m=+28.936492371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403022 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403043 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403058 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403095 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:31.403083778 +0000 UTC m=+28.936634134 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403189 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403249 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403273 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.403373 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:31.403343234 +0000 UTC m=+28.936893530 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.419057 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.419119 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.419134 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.419148 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} Oct 13 13:07:27 crc 
kubenswrapper[4797]: I1013 13:07:27.419159 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.421433 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02"} Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.422781 4797 generic.go:334] "Generic (PLEG): container finished" podID="94a6e41f-8980-41db-a008-d5a81058cdba" containerID="d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c" exitCode=0 Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.423551 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerDied","Data":"d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c"} Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.438188 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: E1013 13:07:27.456855 4797 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.475040 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7c2fp" Oct 13 13:07:27 crc kubenswrapper[4797]: W1013 13:07:27.498428 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c10f51f_a7f1_4ab8_8d9c_fc358bd7f2c6.slice/crio-4f31c6404deeabaf6d8360cbdabd2ff874f1f70b2f61750fb4f18091a9320da6 WatchSource:0}: Error finding container 4f31c6404deeabaf6d8360cbdabd2ff874f1f70b2f61750fb4f18091a9320da6: Status 404 returned error can't find the container with id 4f31c6404deeabaf6d8360cbdabd2ff874f1f70b2f61750fb4f18091a9320da6 Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.500346 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.538348 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.580030 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.621011 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.666889 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.696338 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.739706 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.782100 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.819039 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.858516 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.913677 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.943796 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:27 crc kubenswrapper[4797]: I1013 13:07:27.980262 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:27Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.018939 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.060473 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.112126 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234
d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.139521 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.190035 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.220638 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.263362 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.432374 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7c2fp" event={"ID":"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6","Type":"ContainerStarted","Data":"55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e"} Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.432472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7c2fp" event={"ID":"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6","Type":"ContainerStarted","Data":"4f31c6404deeabaf6d8360cbdabd2ff874f1f70b2f61750fb4f18091a9320da6"} Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.435736 4797 generic.go:334] "Generic (PLEG): container finished" podID="94a6e41f-8980-41db-a008-d5a81058cdba" containerID="8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139" exitCode=0 Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.435823 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerDied","Data":"8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139"} Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.446004 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.459893 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.476476 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.492908 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.506073 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.525164 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.536468 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.547404 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.577507 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.620110 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.663546 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.697207 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.738065 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.784891 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.816310 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.867344 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.899000 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.939311 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:28 crc kubenswrapper[4797]: I1013 13:07:28.991338 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:28Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.017004 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.056974 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.101540 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.139788 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.176729 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.223627 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.236111 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.236187 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.236282 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:29 crc kubenswrapper[4797]: E1013 13:07:29.236279 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:29 crc kubenswrapper[4797]: E1013 13:07:29.236403 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:29 crc kubenswrapper[4797]: E1013 13:07:29.236497 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.264667 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.325623 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.345364 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.382718 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.422938 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.453131 4797 generic.go:334] "Generic (PLEG): container finished" podID="94a6e41f-8980-41db-a008-d5a81058cdba" containerID="12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4" exitCode=0 Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.453198 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerDied","Data":"12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4"} Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.459506 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.497457 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.539616 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.576094 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.622322 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.659664 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.701606 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.739333 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.778236 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.815841 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.859177 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.915854 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234
d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.940199 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:29 crc kubenswrapper[4797]: I1013 13:07:29.977592 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:29Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.016932 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.018860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.018898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.018914 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.019048 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.025001 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.071715 4797 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.072057 4797 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.073440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.073506 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.073522 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.073552 4797 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.073569 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: E1013 13:07:30.087694 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.092305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.092371 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.092389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.092428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.092447 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.099782 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: E1013 13:07:30.106790 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.111167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.111218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.111232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.111254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.111268 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: E1013 13:07:30.128203 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.132652 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.132699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.132715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.132764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.132780 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: E1013 13:07:30.153102 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.157164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.157207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.157220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.157238 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.157249 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: E1013 13:07:30.172086 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: E1013 13:07:30.172259 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.180282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.180341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.180396 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.180414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.180427 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.283026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.283092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.283117 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.283144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.283163 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.386195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.386273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.386296 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.386327 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.386353 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.459429 4797 generic.go:334] "Generic (PLEG): container finished" podID="94a6e41f-8980-41db-a008-d5a81058cdba" containerID="578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9" exitCode=0 Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.459495 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerDied","Data":"578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.464887 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.483248 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.488623 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.488669 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.488684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.488703 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.488715 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.501710 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5
8e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.518724 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.536565 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.550662 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.565603 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.578719 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.590899 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.590929 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.590938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.590953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.590963 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.593635 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.605692 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.622508 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.642186 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.656065 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.672522 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.686651 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.694732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.694781 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.694794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc 
kubenswrapper[4797]: I1013 13:07:30.694841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.694857 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.702252 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:30Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.799055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.799100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.799113 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.799131 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.799142 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.902591 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.902639 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.902651 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.902668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:30 crc kubenswrapper[4797]: I1013 13:07:30.902681 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:30Z","lastTransitionTime":"2025-10-13T13:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.005341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.005386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.005399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.005421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.005438 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.108316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.108367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.108381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.108400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.108412 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.210893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.210949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.210959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.210977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.210989 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.235415 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.235561 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.235612 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.235792 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.236012 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.236210 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.314336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.314393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.314414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.314438 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.314453 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.347785 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.348162 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 13:07:39.34812737 +0000 UTC m=+36.881677656 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.417567 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.417614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.417626 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.417645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.417659 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.450613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.450701 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.450750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.450787 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.450882 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.450912 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.450946 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.450966 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.450980 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:39.450955584 +0000 UTC m=+36.984505860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451018 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:39.451000996 +0000 UTC m=+36.984551262 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451035 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451068 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451088 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451157 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:39.451135789 +0000 UTC m=+36.984686085 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451039 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: E1013 13:07:31.451247 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:39.451222821 +0000 UTC m=+36.984773217 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.475021 4797 generic.go:334] "Generic (PLEG): container finished" podID="94a6e41f-8980-41db-a008-d5a81058cdba" containerID="d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e" exitCode=0 Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.475086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerDied","Data":"d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.490242 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.502531 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.521344 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.522662 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.522712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.522726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.522744 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.522757 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.540950 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.563319 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.579951 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.602768 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.614967 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:
05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.624512 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.625553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.625666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.625750 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc 
kubenswrapper[4797]: I1013 13:07:31.625966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.626062 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.635921 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.647477 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.665540 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.686211 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.700873 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.713339 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:31Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.728081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.728112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.728122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc 
kubenswrapper[4797]: I1013 13:07:31.728136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.728144 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.830420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.830458 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.830472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.830486 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.830496 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.933059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.933082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.933091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.933105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:31 crc kubenswrapper[4797]: I1013 13:07:31.933116 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:31Z","lastTransitionTime":"2025-10-13T13:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.035572 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.035626 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.035643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.035665 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.035680 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.137586 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.137629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.137644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.137663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.137677 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.240722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.240795 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.240857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.240883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.240904 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.344412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.344448 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.344460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.344475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.344491 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.447698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.447730 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.447741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.447757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.447771 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.510756 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" event={"ID":"94a6e41f-8980-41db-a008-d5a81058cdba","Type":"ContainerStarted","Data":"4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.524331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.524560 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.524625 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.524640 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.531198 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.544072 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.549715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.549749 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.549758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc 
kubenswrapper[4797]: I1013 13:07:32.549774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.549784 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.554655 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.566841 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.575563 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.579253 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.584357 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.596602 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.607397 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.623543 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.633736 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.642418 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.652342 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.652375 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.652383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.652397 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.652407 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.656313 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.673946 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.687698 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.699407 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.713052 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.722053 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.730950 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.756441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.756490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.756508 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.756533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.756551 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.765423 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.790554 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.810396 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.827752 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.850793 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.858366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.858445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.858469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.858501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.858528 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.869065 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.885832 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.899862 4797 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.913033 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.936280 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.961946 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:32 crc 
kubenswrapper[4797]: I1013 13:07:32.962005 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.962023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.962048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.962064 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:32Z","lastTransitionTime":"2025-10-13T13:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.969481 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:32 crc kubenswrapper[4797]: I1013 13:07:32.995695 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o:
//8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.014654 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.065158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.065226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.065240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.065261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.065277 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.168327 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.168422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.168525 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.168641 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.168678 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.237095 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.237189 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.237225 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:33 crc kubenswrapper[4797]: E1013 13:07:33.237366 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:33 crc kubenswrapper[4797]: E1013 13:07:33.237470 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:33 crc kubenswrapper[4797]: E1013 13:07:33.237653 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.255195 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.271467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.271542 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.271568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.271597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.271619 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.276916 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.324769 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-reso
urces-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.355294 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.374756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.374860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.374885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.374916 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.374938 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.375294 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.394947 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.414456 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.426606 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: 
I1013 13:07:33.436824 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.446749 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7937
9b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.457321 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.467635 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.477263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.477307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.477323 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc 
kubenswrapper[4797]: I1013 13:07:33.477345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.477364 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.484371 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.504715 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.525511 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.580372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.580418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.580430 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.580449 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.580464 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.687563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.687635 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.687646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.687670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.687683 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.790406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.790465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.790486 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.790509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.790526 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.893579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.893648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.893664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.893690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.893711 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.996858 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.996903 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.996912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.996931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:33 crc kubenswrapper[4797]: I1013 13:07:33.996942 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:33Z","lastTransitionTime":"2025-10-13T13:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.100295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.100344 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.100356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.100374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.100387 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.203264 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.203395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.203415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.203440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.203481 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.306660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.306716 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.306735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.306760 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.306779 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.409397 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.409440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.409455 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.409477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.409492 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.511849 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.511893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.511905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.511924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.511937 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.614382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.614452 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.614471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.614496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.614516 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.717224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.717286 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.717303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.717327 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.717344 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.820535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.820600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.820620 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.820646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.820662 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.923400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.923501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.923517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.923537 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:34 crc kubenswrapper[4797]: I1013 13:07:34.923550 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:34Z","lastTransitionTime":"2025-10-13T13:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.026191 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.026244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.026263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.026291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.026309 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.129004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.129054 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.129066 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.129084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.129097 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.231356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.231405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.231417 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.231436 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.231449 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.235787 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:35 crc kubenswrapper[4797]: E1013 13:07:35.235951 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.236028 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.236054 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:35 crc kubenswrapper[4797]: E1013 13:07:35.236122 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:35 crc kubenswrapper[4797]: E1013 13:07:35.236277 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.334982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.335043 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.335061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.335090 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.335107 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.439162 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.439227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.439244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.439270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.439286 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.534128 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/0.log" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.537913 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168" exitCode=1 Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.537953 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.538765 4797 scope.go:117] "RemoveContainer" containerID="b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.542466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.542536 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.542560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.542595 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.542617 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.560318 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2
234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.578622 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.596434 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.610394 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.632025 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.645249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.645312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.645330 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.645354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.645371 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.648416 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.666622 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.683756 4797 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.701007 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.720218 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.742249 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:35Z\\\",\\\"message\\\":\\\"(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1013 13:07:35.108652 6082 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 13:07:35.108802 6082 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.108884 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109212 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109520 6082 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109724 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.110384 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 13:07:35.110420 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 13:07:35.110445 6082 factory.go:656] Stopping watch factory\\\\nI1013 13:07:35.110464 6082 ovnkube.go:599] Stopped ovnkube\\\\nI1013 13:07:35.110618 6082 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae
2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.748675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.748726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.748746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.748769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.748783 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.761940 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.777337 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.789798 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.802128 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:35Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.850913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.850944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.850953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.850968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.850978 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.954670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.954702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.954713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.954728 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:35 crc kubenswrapper[4797]: I1013 13:07:35.954740 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:35Z","lastTransitionTime":"2025-10-13T13:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.057293 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.057336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.057352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.057372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.057386 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.159912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.159968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.159980 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.159998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.160013 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.263235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.263299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.263319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.263343 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.263360 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.366294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.366351 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.366366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.366389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.366404 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.469054 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.469080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.469089 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.469103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.469115 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.543143 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/1.log" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.543629 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/0.log" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.546568 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944" exitCode=1 Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.546599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.546638 4797 scope.go:117] "RemoveContainer" containerID="b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.547219 4797 scope.go:117] "RemoveContainer" containerID="f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944" Oct 13 13:07:36 crc kubenswrapper[4797]: E1013 13:07:36.547356 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.562644 4797 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.571028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.571055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.571067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.571080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.571090 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.580309 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.593131 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.610721 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.631677 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:35Z\\\",\\\"message\\\":\\\"(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1013 13:07:35.108652 6082 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 13:07:35.108802 6082 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.108884 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109212 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109520 6082 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109724 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.110384 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 13:07:35.110420 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 13:07:35.110445 6082 factory.go:656] Stopping watch factory\\\\nI1013 13:07:35.110464 6082 ovnkube.go:599] Stopped ovnkube\\\\nI1013 13:07:35.110618 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"h
ost-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.652959 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.667899 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.673738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.673784 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.673820 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.673841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.673855 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.688984 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.712631 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.744101 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b
4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.763888 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.776948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.776994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.777009 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.777030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.777045 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.783528 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.802988 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.820366 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.835639 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.880635 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.880702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.880724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.880756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.880780 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.983740 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.983801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.983858 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.983905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:36 crc kubenswrapper[4797]: I1013 13:07:36.983921 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:36Z","lastTransitionTime":"2025-10-13T13:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.087112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.087145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.087153 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.087167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.087174 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.190796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.190880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.190898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.190922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.190940 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.235182 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:37 crc kubenswrapper[4797]: E1013 13:07:37.235286 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.235601 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:37 crc kubenswrapper[4797]: E1013 13:07:37.235645 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.235779 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:37 crc kubenswrapper[4797]: E1013 13:07:37.235859 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.293353 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.293435 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.293502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.293533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.293556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.397499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.397553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.397569 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.397592 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.397609 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.499629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.499666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.499677 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.499691 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.499700 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.551412 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/1.log" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.601405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.601439 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.601447 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.601460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.601471 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.704539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.704588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.704604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.704627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.704646 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.806474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.806512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.806520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.806533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.806542 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.908627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.908673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.908685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.908704 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:37 crc kubenswrapper[4797]: I1013 13:07:37.908715 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:37Z","lastTransitionTime":"2025-10-13T13:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.010791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.010909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.010936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.010969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.010995 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.113762 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.113837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.113854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.113877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.113898 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.217003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.217063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.217079 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.217105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.217124 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.242016 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz"] Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.244090 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.249921 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.254873 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.286356 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.303082 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.321059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.321129 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.321154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.321183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.321206 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.322041 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.331664 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b24c284-a754-4877-83cc-334b0a893a47-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.331803 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/4b24c284-a754-4877-83cc-334b0a893a47-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.331912 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b24c284-a754-4877-83cc-334b0a893a47-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.331973 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzc9\" (UniqueName: \"kubernetes.io/projected/4b24c284-a754-4877-83cc-334b0a893a47-kube-api-access-znzc9\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.344270 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.367005 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.382758 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: 
I1013 13:07:38.395451 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.413532 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7937
9b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.424074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.424120 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.424136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.424157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.424174 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.431326 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.432615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b24c284-a754-4877-83cc-334b0a893a47-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.432667 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b24c284-a754-4877-83cc-334b0a893a47-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.432722 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b24c284-a754-4877-83cc-334b0a893a47-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.432756 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzc9\" (UniqueName: \"kubernetes.io/projected/4b24c284-a754-4877-83cc-334b0a893a47-kube-api-access-znzc9\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.434929 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b24c284-a754-4877-83cc-334b0a893a47-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.436520 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b24c284-a754-4877-83cc-334b0a893a47-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.441737 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b24c284-a754-4877-83cc-334b0a893a47-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.447538 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.462954 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzc9\" (UniqueName: \"kubernetes.io/projected/4b24c284-a754-4877-83cc-334b0a893a47-kube-api-access-znzc9\") pod \"ovnkube-control-plane-749d76644c-hvhmz\" (UID: \"4b24c284-a754-4877-83cc-334b0a893a47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.467149 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.491076 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:35Z\\\",\\\"message\\\":\\\"(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1013 13:07:35.108652 6082 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 13:07:35.108802 6082 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.108884 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109212 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109520 6082 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109724 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.110384 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 13:07:35.110420 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 13:07:35.110445 6082 factory.go:656] Stopping watch factory\\\\nI1013 13:07:35.110464 6082 ovnkube.go:599] Stopped ovnkube\\\\nI1013 13:07:35.110618 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"h
ost-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.509375 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.522170 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.526339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.526383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.526399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.526422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.526440 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.538498 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.551327 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:38Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.577468 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" Oct 13 13:07:38 crc kubenswrapper[4797]: W1013 13:07:38.589744 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b24c284_a754_4877_83cc_334b0a893a47.slice/crio-c51b69d3a810763d35b1bd5260a45a9ace19f3e1b5355008a2eacb646eac59a5 WatchSource:0}: Error finding container c51b69d3a810763d35b1bd5260a45a9ace19f3e1b5355008a2eacb646eac59a5: Status 404 returned error can't find the container with id c51b69d3a810763d35b1bd5260a45a9ace19f3e1b5355008a2eacb646eac59a5 Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.629106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.629186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.629210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.629240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.629263 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.731370 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.731414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.731427 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.731443 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.731455 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.834360 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.834419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.834437 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.834461 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.834479 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.936767 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.936860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.936889 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.936922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:38 crc kubenswrapper[4797]: I1013 13:07:38.936946 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:38Z","lastTransitionTime":"2025-10-13T13:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.039932 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.039982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.040003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.040026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.040043 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.142119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.142178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.142196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.142222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.142238 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.235986 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.235997 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.237096 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.236169 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.237240 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.237349 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.245428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.245472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.245488 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.245510 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.245527 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.348190 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.348257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.348276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.348308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.348335 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.404197 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pdvg5"] Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.404665 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.404739 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.426558 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.442245 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.442499 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nspn\" (UniqueName: \"kubernetes.io/projected/e65d35bc-209d-4438-ae53-31deb132aaf5-kube-api-access-8nspn\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.442544 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.442682 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 13:07:55.442658654 +0000 UTC m=+52.976208950 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.443528 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.451896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.451965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.451988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.452018 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.452039 4797 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.463522 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.479352 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc 
kubenswrapper[4797]: I1013 13:07:39.511372 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.533428 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.543451 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.543523 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.543612 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.543700 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.543731 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.543728 4797 configmap.go:193] Couldn't get 
configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.543747 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.543798 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:55.543783319 +0000 UTC m=+53.077333585 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.543852 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:55.54384134 +0000 UTC m=+53.077391606 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544008 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544139 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:55.544103796 +0000 UTC m=+53.077654092 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.544367 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.544421 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nspn\" (UniqueName: 
\"kubernetes.io/projected/e65d35bc-209d-4438-ae53-31deb132aaf5-kube-api-access-8nspn\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.544460 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544583 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544638 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:40.044618808 +0000 UTC m=+37.578169104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544731 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544752 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544773 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:39 crc kubenswrapper[4797]: E1013 13:07:39.544873 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:55.544847814 +0000 UTC m=+53.078398170 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.551563 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.555098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.555148 4797 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.555163 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.555181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.555194 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.561614 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" event={"ID":"4b24c284-a754-4877-83cc-334b0a893a47","Type":"ContainerStarted","Data":"95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.561672 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" event={"ID":"4b24c284-a754-4877-83cc-334b0a893a47","Type":"ContainerStarted","Data":"72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.561690 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" event={"ID":"4b24c284-a754-4877-83cc-334b0a893a47","Type":"ContainerStarted","Data":"c51b69d3a810763d35b1bd5260a45a9ace19f3e1b5355008a2eacb646eac59a5"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.569683 4797 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.575264 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nspn\" (UniqueName: \"kubernetes.io/projected/e65d35bc-209d-4438-ae53-31deb132aaf5-kube-api-access-8nspn\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.590978 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.604651 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.615456 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.625254 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.639380 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.653641 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.657554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.657593 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc 
kubenswrapper[4797]: I1013 13:07:39.657601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.657615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.657627 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.666138 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.677595 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.697194 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:35Z\\\",\\\"message\\\":\\\"(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1013 13:07:35.108652 6082 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 13:07:35.108802 6082 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.108884 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109212 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109520 6082 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109724 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.110384 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 13:07:35.110420 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 13:07:35.110445 6082 factory.go:656] Stopping watch factory\\\\nI1013 13:07:35.110464 6082 ovnkube.go:599] Stopped ovnkube\\\\nI1013 13:07:35.110618 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"h
ost-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.708766 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.725881 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:35Z\\\",\\\"message\\\":\\\"(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1013 13:07:35.108652 6082 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 13:07:35.108802 6082 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.108884 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109212 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109520 6082 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109724 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.110384 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 13:07:35.110420 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 13:07:35.110445 6082 factory.go:656] Stopping watch factory\\\\nI1013 13:07:35.110464 6082 ovnkube.go:599] Stopped ovnkube\\\\nI1013 13:07:35.110618 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"h
ost-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.742522 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.759361 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.760130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.760186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.760196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.760213 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.760223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.774686 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.785209 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.795935 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.805583 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.821755 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.842363 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.858322 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.862486 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.862523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.862535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.862556 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.862568 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.871619 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.893040 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.924407 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b
4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.943203 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.960172 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.965550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.965603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.965621 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:39 crc 
kubenswrapper[4797]: I1013 13:07:39.965645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.965662 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:39Z","lastTransitionTime":"2025-10-13T13:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:39 crc kubenswrapper[4797]: I1013 13:07:39.975281 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:39Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.053031 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.053197 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:40 crc 
kubenswrapper[4797]: E1013 13:07:40.053279 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:41.053256401 +0000 UTC m=+38.586806697 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.068637 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.068689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.068706 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.068729 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.068749 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.171502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.171566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.171589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.171618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.171639 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.274884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.274952 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.274975 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.275006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.275031 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.380070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.380113 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.380124 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.380141 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.380152 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.482535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.482574 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.482585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.482602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.482614 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.557380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.557444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.557457 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.557476 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.557490 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.575528 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.580781 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.580888 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.580916 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.580940 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.580962 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.608057 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.614358 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.614412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.614422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.614441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.614454 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.632160 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.636836 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.636880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.636890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.636908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.636919 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.651375 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.655126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.655177 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.655187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.655208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.655221 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.668745 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:40Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:40 crc kubenswrapper[4797]: E1013 13:07:40.668965 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.670622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.670661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.670673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.670692 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.670705 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.773167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.773222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.773238 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.773262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.773279 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.876154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.876194 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.876205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.876220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.876230 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.978709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.978756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.978769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.978789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:40 crc kubenswrapper[4797]: I1013 13:07:40.978801 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:40Z","lastTransitionTime":"2025-10-13T13:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.082164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.082214 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.082232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.082253 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.082269 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.087949 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:41 crc kubenswrapper[4797]: E1013 13:07:41.088115 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:41 crc kubenswrapper[4797]: E1013 13:07:41.088167 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:43.088150535 +0000 UTC m=+40.621700801 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.185125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.185176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.185190 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.185213 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.185231 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.235418 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.235486 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.235620 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:41 crc kubenswrapper[4797]: E1013 13:07:41.235636 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:41 crc kubenswrapper[4797]: E1013 13:07:41.235756 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:41 crc kubenswrapper[4797]: E1013 13:07:41.235922 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.236027 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:41 crc kubenswrapper[4797]: E1013 13:07:41.236237 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.288473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.288545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.288564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.288585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.288599 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.390924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.391028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.391052 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.391082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.391104 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.493724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.493764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.493776 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.493791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.493816 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.596413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.596457 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.596474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.596495 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.596511 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.699684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.699752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.699777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.699835 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.699853 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.802176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.802241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.802264 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.802292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.802313 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.905450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.905511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.905529 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.905553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:41 crc kubenswrapper[4797]: I1013 13:07:41.905573 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:41Z","lastTransitionTime":"2025-10-13T13:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.012035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.012112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.012129 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.012150 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.012170 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.114277 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.114334 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.114352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.114376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.114393 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.217088 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.217161 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.217178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.217204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.217223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.320492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.320549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.320566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.320590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.320607 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.423332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.423366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.423375 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.423389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.423400 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.525675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.525720 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.525732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.525748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.525759 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.628532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.628584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.628597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.628615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.628629 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.731336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.731407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.731428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.731454 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.731471 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.834643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.834711 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.834730 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.834755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.834773 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.936931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.937014 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.937046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.937078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:42 crc kubenswrapper[4797]: I1013 13:07:42.937099 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:42Z","lastTransitionTime":"2025-10-13T13:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.040016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.040056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.040068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.040083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.040092 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.109263 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:43 crc kubenswrapper[4797]: E1013 13:07:43.109409 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:43 crc kubenswrapper[4797]: E1013 13:07:43.109496 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:47.109471822 +0000 UTC m=+44.643022108 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.142860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.142944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.142969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.143000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.143022 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.235976 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.235982 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.236020 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.236068 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:43 crc kubenswrapper[4797]: E1013 13:07:43.236191 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:43 crc kubenswrapper[4797]: E1013 13:07:43.236490 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:43 crc kubenswrapper[4797]: E1013 13:07:43.236549 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:43 crc kubenswrapper[4797]: E1013 13:07:43.236628 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.245254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.245279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.245287 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.245299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.245307 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.254525 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z 
is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.279168 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640eb4e66649937e6ff80419eed3945beffd841847b10c5d9405c6d69eb6168\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:35Z\\\",\\\"message\\\":\\\"(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1013 13:07:35.108652 6082 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 13:07:35.108802 6082 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.108884 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109212 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109520 6082 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.109724 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 13:07:35.110384 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 13:07:35.110420 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 13:07:35.110445 6082 factory.go:656] Stopping watch factory\\\\nI1013 13:07:35.110464 6082 ovnkube.go:599] Stopped ovnkube\\\\nI1013 13:07:35.110618 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"h
ost-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.294482 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.308941 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.321344 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.336174 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc 
kubenswrapper[4797]: I1013 13:07:43.347538 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.347838 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.348012 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.348163 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.348287 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.352256 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.363383 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.373015 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.384716 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.399511 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.411920 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.427196 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.450858 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.450925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.450936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.450951 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.450963 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.454507 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.469460 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.482032 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.492373 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.553228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.553266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.553277 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.553293 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.553304 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.655244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.655283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.655291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.655304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.655312 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.757633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.757683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.757699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.757719 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.757735 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.860303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.860362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.860379 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.860401 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.860420 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.964745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.964858 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.964884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.964917 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:43 crc kubenswrapper[4797]: I1013 13:07:43.964951 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:43Z","lastTransitionTime":"2025-10-13T13:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.068329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.068387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.068409 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.068441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.068460 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.170532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.170575 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.170586 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.170602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.170614 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.273502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.273583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.273616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.273646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.273666 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.376519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.376580 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.376598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.376622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.376641 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.479759 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.480204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.480351 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.480491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.480683 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.584184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.584252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.584275 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.584306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.584330 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.687576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.687631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.687649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.687672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.687690 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.791188 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.791236 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.791247 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.791267 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.791280 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.894501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.894541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.894550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.894566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.894574 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.998941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.998991 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.999011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.999035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:44 crc kubenswrapper[4797]: I1013 13:07:44.999053 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:44Z","lastTransitionTime":"2025-10-13T13:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.102122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.102184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.102203 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.102229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.102246 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.205769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.205847 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.205857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.205877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.205890 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.235712 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.235857 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.235909 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.235869 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:45 crc kubenswrapper[4797]: E1013 13:07:45.236042 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:45 crc kubenswrapper[4797]: E1013 13:07:45.236277 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:45 crc kubenswrapper[4797]: E1013 13:07:45.236462 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:45 crc kubenswrapper[4797]: E1013 13:07:45.236594 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.308329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.308383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.308400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.308424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.308441 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.412138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.412206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.412224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.412250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.412269 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.515769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.515888 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.515917 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.515951 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.515978 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.621142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.621209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.621227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.621255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.621271 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.725114 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.725175 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.725192 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.725219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.725236 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.827875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.827947 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.827970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.828001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.828025 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.931338 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.931394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.931417 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.931442 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:45 crc kubenswrapper[4797]: I1013 13:07:45.931460 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:45Z","lastTransitionTime":"2025-10-13T13:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.035260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.035346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.035385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.035419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.035440 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.139125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.139269 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.139293 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.139321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.139342 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.242931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.242998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.243021 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.243048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.243069 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.345957 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.346039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.346058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.346081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.346098 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.449776 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.449860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.449877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.449901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.449919 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.552566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.552601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.552609 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.552622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.552632 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.656110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.656182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.656205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.656234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.656257 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.759373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.759444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.759465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.759490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.759507 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.863002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.863062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.863086 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.863118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.863136 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.967621 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.967698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.967717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.967744 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:46 crc kubenswrapper[4797]: I1013 13:07:46.967762 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:46Z","lastTransitionTime":"2025-10-13T13:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.071563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.071640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.071660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.071688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.071709 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.157356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:47 crc kubenswrapper[4797]: E1013 13:07:47.157509 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:47 crc kubenswrapper[4797]: E1013 13:07:47.157571 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:07:55.157554122 +0000 UTC m=+52.691104388 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.174705 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.174773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.174792 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.174844 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.174867 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.235452 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.235637 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.235735 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:47 crc kubenswrapper[4797]: E1013 13:07:47.235763 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:47 crc kubenswrapper[4797]: E1013 13:07:47.235908 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.235971 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:47 crc kubenswrapper[4797]: E1013 13:07:47.236097 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:47 crc kubenswrapper[4797]: E1013 13:07:47.236233 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.278762 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.278873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.278904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.278941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.278972 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.383092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.383158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.383167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.383185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.383196 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.486336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.486390 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.486408 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.486440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.486462 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.589140 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.589208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.589230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.589260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.589280 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.692671 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.692742 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.692758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.692783 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.692826 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.797218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.797301 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.797321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.797351 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.797371 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.900633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.900712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.900735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.900796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:47 crc kubenswrapper[4797]: I1013 13:07:47.900858 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:47Z","lastTransitionTime":"2025-10-13T13:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.004658 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.004743 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.004766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.004793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.004846 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.107905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.107975 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.107997 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.108025 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.108044 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.211309 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.211387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.211413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.211445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.211468 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.237048 4797 scope.go:117] "RemoveContainer" containerID="f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.257703 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.280850 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.315585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc 
kubenswrapper[4797]: I1013 13:07:48.315642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.315658 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.315679 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.315701 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.313663 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] 
failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.337293 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.351263 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.367128 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.384448 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc 
kubenswrapper[4797]: I1013 13:07:48.403093 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.417418 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.420451 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.420489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.420504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.420525 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.420540 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.453375 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.475029 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.494825 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.507427 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.522669 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.522690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.522700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.522714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.522725 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.523841 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.534765 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.547529 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.562651 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.593068 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/1.log" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.602734 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" 
event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.603469 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.623350 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.625331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.625359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.625373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.625392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.625406 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.639306 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.663202 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-o
penvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41
ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.682948 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.701606 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.715615 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.728526 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.728565 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.728576 4797 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.728594 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.728607 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.732410 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 
13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.751689 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.765486 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.800407 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.825676 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.833119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.833167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.833179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.833198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.833211 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.844119 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.860487 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.877866 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.888745 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.901295 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.911832 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:48Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.935200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.935236 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.935246 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:48 crc 
kubenswrapper[4797]: I1013 13:07:48.935262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:48 crc kubenswrapper[4797]: I1013 13:07:48.935272 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:48Z","lastTransitionTime":"2025-10-13T13:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.038612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.038952 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.039057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.039154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.039266 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.142029 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.142069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.142081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.142097 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.142107 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.235481 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.235496 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:49 crc kubenswrapper[4797]: E1013 13:07:49.235658 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.235675 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:49 crc kubenswrapper[4797]: E1013 13:07:49.235773 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.235492 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:49 crc kubenswrapper[4797]: E1013 13:07:49.235950 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:49 crc kubenswrapper[4797]: E1013 13:07:49.236109 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.244497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.244532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.244543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.244557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.244568 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.347389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.347445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.347465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.347485 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.347498 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.451102 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.451155 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.451167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.451185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.451201 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.554924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.554988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.555005 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.555034 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.555082 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.609515 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/2.log" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.610860 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/1.log" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.614989 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50" exitCode=1 Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.615065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.615175 4797 scope.go:117] "RemoveContainer" containerID="f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.616531 4797 scope.go:117] "RemoveContainer" containerID="dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50" Oct 13 13:07:49 crc kubenswrapper[4797]: E1013 13:07:49.616895 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.638172 4797 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.657649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.657709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.657728 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.657751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.657771 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.659679 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.676403 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc 
kubenswrapper[4797]: I1013 13:07:49.697111 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.730611 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda0
8736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.750879 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.761030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.761107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.761127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.761153 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.761171 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.776003 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.793926 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.814014 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.827188 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: 
I1013 13:07:49.838958 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.852174 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7937
9b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.864129 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.864205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.864227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.864518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.864577 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.868320 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.882744 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.897101 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.921120 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f9c135f7562fc84fb86baaf2fcb4fbdc3bb71573315ae64c369e542e394944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:36Z\\\",\\\"message\\\":\\\"ce/redhat-marketplace for network=default\\\\nF1013 13:07:36.374532 6199 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:36Z is after 2025-08-24T17:21:41Z]\\\\nI1013 13:07:36.374702 6199 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc0075c0c4d 0xc0075c0c4e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,P\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.942676 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:49Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.967925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.967962 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.967981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.968004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:49 crc kubenswrapper[4797]: I1013 13:07:49.968019 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:49Z","lastTransitionTime":"2025-10-13T13:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.071489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.071571 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.071633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.071667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.071692 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.174922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.174988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.175004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.175028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.175042 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.278372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.278446 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.278455 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.278474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.278486 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.382071 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.382133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.382155 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.382191 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.382214 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.484205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.484257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.484273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.484295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.484311 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.586912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.586975 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.586992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.587015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.587033 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.622760 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/2.log" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.628956 4797 scope.go:117] "RemoveContainer" containerID="dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50" Oct 13 13:07:50 crc kubenswrapper[4797]: E1013 13:07:50.629221 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.662002 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.687336 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.690368 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.690428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.690447 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.690472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.690491 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.707797 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.731066 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.758649 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.776427 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.791965 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.795994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.796049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.796061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc 
kubenswrapper[4797]: I1013 13:07:50.796081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.796095 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.807271 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.823390 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.837458 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.837947 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.837993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.838011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.838035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.838053 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.853418 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: E1013 13:07:50.856407 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.860358 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.860477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.860500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.860528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.860549 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.869821 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z 
is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: E1013 13:07:50.881595 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.886523 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.887060 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.887083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.887094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.887110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.887124 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.905105 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: E1013 13:07:50.905288 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.910100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.910210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.910284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.910319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.910410 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.919210 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: E1013 13:07:50.934175 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.937845 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.938130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.938211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.938231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.938251 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.938264 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.951647 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc 
kubenswrapper[4797]: E1013 13:07:50.954359 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:50Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:50 crc kubenswrapper[4797]: E1013 13:07:50.954571 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.956619 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.956655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.956666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.956684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:50 crc kubenswrapper[4797]: I1013 13:07:50.956695 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:50Z","lastTransitionTime":"2025-10-13T13:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.059312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.059354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.059366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.059382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.059393 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.162698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.162750 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.162767 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.162790 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.162840 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.235869 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.235974 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.235978 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.235908 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:51 crc kubenswrapper[4797]: E1013 13:07:51.236314 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:51 crc kubenswrapper[4797]: E1013 13:07:51.236400 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:51 crc kubenswrapper[4797]: E1013 13:07:51.236476 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:51 crc kubenswrapper[4797]: E1013 13:07:51.236585 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.266173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.266207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.266218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.266322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.266334 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.369406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.369444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.369455 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.369471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.369483 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.472798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.472888 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.472912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.472941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.473000 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.575796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.575882 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.575899 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.575925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.575942 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.679153 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.679201 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.679218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.679243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.679265 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.782288 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.782346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.782362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.782388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.782406 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.885516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.885574 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.885592 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.885616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.885636 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.989026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.989089 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.989111 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.989139 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:51 crc kubenswrapper[4797]: I1013 13:07:51.989159 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:51Z","lastTransitionTime":"2025-10-13T13:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.093380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.093451 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.093478 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.093505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.093556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.196600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.196664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.196682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.196709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.196732 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.300974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.301053 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.301077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.301109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.301133 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.404329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.404381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.404398 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.404424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.404440 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.508444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.508518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.508545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.508577 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.508602 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.611470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.611513 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.611524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.611542 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.611554 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.714701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.714763 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.714785 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.714848 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.714875 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.817287 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.817337 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.817354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.817378 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.817395 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.919760 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.919849 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.919867 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.919895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:52 crc kubenswrapper[4797]: I1013 13:07:52.919912 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:52Z","lastTransitionTime":"2025-10-13T13:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.022902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.022958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.022977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.023000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.023017 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.129603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.129643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.129651 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.129666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.129677 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.232937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.233008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.233029 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.233059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.233080 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.235510 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.235585 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.235595 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.235729 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:53 crc kubenswrapper[4797]: E1013 13:07:53.235733 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:53 crc kubenswrapper[4797]: E1013 13:07:53.235954 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:53 crc kubenswrapper[4797]: E1013 13:07:53.236049 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:53 crc kubenswrapper[4797]: E1013 13:07:53.236116 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.252735 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.277559 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.299109 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.318844 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.336927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.337197 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.337361 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.337933 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.338293 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.338527 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.355146 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc 
kubenswrapper[4797]: I1013 13:07:53.369259 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.379930 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.391954 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.405995 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.423453 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.443035 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.443298 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.443346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.443367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.443397 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.443420 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.459605 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.483796 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.502083 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:
05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.515508 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.526837 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.546221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.546270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.546285 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.546304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.546317 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.649317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.649386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.649405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.649430 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.649446 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.752925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.752987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.753008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.753039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.753063 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.856654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.856713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.856729 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.856754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.856773 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.952519 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.962558 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.962637 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.962660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.962693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.962716 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:53Z","lastTransitionTime":"2025-10-13T13:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.968673 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.976292 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:53 crc kubenswrapper[4797]: I1013 13:07:53.992908 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0
f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:53Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.011926 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.028111 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc 
kubenswrapper[4797]: I1013 13:07:54.046578 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.065672 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.069507 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.069582 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.069604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.069635 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.069656 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.089732 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.123156 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.145156 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.165408 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.172354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.172398 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.172415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.172436 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.172453 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.182445 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.197498 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.231656 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.275717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.275767 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.275789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.275849 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.275873 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.284159 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5
8e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.302635 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.314433 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.326718 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:07:54Z is after 2025-08-24T17:21:41Z" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.378505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc 
kubenswrapper[4797]: I1013 13:07:54.378560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.378576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.378614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.378634 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.483084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.483184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.483212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.483247 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.483284 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.587095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.587151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.587168 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.587189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.587206 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.690492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.690531 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.690543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.690560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.690570 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.794612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.794683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.794703 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.794732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.794751 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.897738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.897801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.897881 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.897908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:54 crc kubenswrapper[4797]: I1013 13:07:54.897929 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:54Z","lastTransitionTime":"2025-10-13T13:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.000994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.001058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.001076 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.001100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.001118 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.104521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.104607 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.104631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.104664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.104688 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.208473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.208557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.208579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.208609 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.208633 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.235989 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.236026 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.236099 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.236209 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.236249 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.236394 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.236487 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.236662 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.252189 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.252431 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.252522 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:08:11.252499144 +0000 UTC m=+68.786049430 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.311544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.311645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.311668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.311694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.311713 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.415498 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.415586 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.415614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.415645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.415669 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.453369 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.453530 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 13:08:27.453497533 +0000 UTC m=+84.987047829 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.518661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.518729 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.518753 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.518777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.518794 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.554365 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.554492 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.554588 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:08:27.554561026 +0000 UTC m=+85.088111312 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.554498 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.554720 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.554637 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.554892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.554972 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:08:27.554944995 +0000 UTC m=+85.088495301 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.554978 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555011 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555032 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555069 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555097 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-13 13:08:27.555080688 +0000 UTC m=+85.088630974 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555099 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555124 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:55 crc kubenswrapper[4797]: E1013 13:07:55.555177 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:08:27.55516085 +0000 UTC m=+85.088711136 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.622140 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.622200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.622219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.622242 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.622260 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.724676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.724753 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.724776 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.724851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.724892 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.828435 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.828492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.828509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.828536 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.828557 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.931421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.931473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.931490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.931514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:55 crc kubenswrapper[4797]: I1013 13:07:55.931533 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:55Z","lastTransitionTime":"2025-10-13T13:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.034955 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.035025 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.035044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.035069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.035094 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.138705 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.138771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.138793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.138867 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.138890 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.241449 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.241520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.241543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.241570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.241596 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.344864 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.344925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.344943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.344967 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.345025 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.448084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.448149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.448175 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.448206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.448229 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.551399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.551463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.551489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.551523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.551544 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.654095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.654148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.654165 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.654188 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.654212 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.758026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.758081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.758098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.758120 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.758138 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.860893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.860943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.860960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.860984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.861001 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.963205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.963246 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.963256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.963271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:56 crc kubenswrapper[4797]: I1013 13:07:56.963281 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:56Z","lastTransitionTime":"2025-10-13T13:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.065736 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.065794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.065851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.065876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.065902 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.168783 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.168866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.168887 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.168914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.168931 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.236091 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.236162 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.236115 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:57 crc kubenswrapper[4797]: E1013 13:07:57.236338 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.236366 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:57 crc kubenswrapper[4797]: E1013 13:07:57.236481 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:57 crc kubenswrapper[4797]: E1013 13:07:57.236621 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:57 crc kubenswrapper[4797]: E1013 13:07:57.236751 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.272342 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.272415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.272439 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.272471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.272497 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.375442 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.375485 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.375499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.375517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.375530 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.478969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.479050 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.479069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.479095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.479112 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.582352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.582429 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.582453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.582482 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.582504 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.685250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.685317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.685340 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.685370 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.685390 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.788528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.788588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.788645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.788675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.788694 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.891372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.891412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.891423 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.891440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.891453 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.997077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.997152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.997177 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.997208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:57 crc kubenswrapper[4797]: I1013 13:07:57.997235 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:57Z","lastTransitionTime":"2025-10-13T13:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.100554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.100630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.100652 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.100683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.100709 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.202958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.203053 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.203072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.203095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.203112 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.306497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.306707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.306747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.306777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.306794 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.409596 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.409655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.409672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.409696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.409712 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.513126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.513188 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.513206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.513230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.513248 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.617296 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.617357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.617380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.617409 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.617431 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.720421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.720474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.720492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.720518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.720537 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.823996 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.824056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.824072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.824099 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.824116 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.927497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.927550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.927578 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.927602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:58 crc kubenswrapper[4797]: I1013 13:07:58.927620 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:58Z","lastTransitionTime":"2025-10-13T13:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.030144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.030185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.030200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.030219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.030235 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.133051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.133103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.133119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.133141 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.133161 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.235100 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.235211 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:07:59 crc kubenswrapper[4797]: E1013 13:07:59.235287 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.235320 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:07:59 crc kubenswrapper[4797]: E1013 13:07:59.235484 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:07:59 crc kubenswrapper[4797]: E1013 13:07:59.235534 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.235594 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:07:59 crc kubenswrapper[4797]: E1013 13:07:59.235644 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.237030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.237057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.237066 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.237081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.237094 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.340475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.340539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.340556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.340581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.340600 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.443337 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.443420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.443444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.443474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.443496 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.546791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.546881 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.546900 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.546923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.546940 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.650265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.650320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.650337 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.650362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.650378 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.752801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.752883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.752902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.752927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.752944 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.855888 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.855945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.855963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.855987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.856005 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.959197 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.959243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.959253 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.959269 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:07:59 crc kubenswrapper[4797]: I1013 13:07:59.959282 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:07:59Z","lastTransitionTime":"2025-10-13T13:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.062386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.062450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.062467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.062490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.062510 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.165255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.165324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.165345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.165371 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.165403 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.268468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.268528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.268545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.268570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.268587 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.371694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.371798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.371908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.371944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.371964 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.475151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.475203 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.475218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.475236 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.475252 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.578100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.578167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.578186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.578212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.578230 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.681589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.681648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.681668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.681693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.681711 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.784869 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.784937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.784959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.784988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.785011 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.887472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.887581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.887598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.887635 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.887653 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.990464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.990531 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.990560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.990588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:00 crc kubenswrapper[4797]: I1013 13:08:00.990611 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:00Z","lastTransitionTime":"2025-10-13T13:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.093878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.093963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.093992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.094026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.094049 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.096070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.096133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.096151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.096180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.096198 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.119344 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:01Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.124223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.124338 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.124362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.124395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.124423 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.146596 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:01Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.156307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.156381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.156405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.156436 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.156459 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.228408 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:01Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.228641 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.230764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.230878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.230896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.230920 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.230938 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.236180 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.236239 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.236412 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.236489 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.236582 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.236733 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.236933 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:01 crc kubenswrapper[4797]: E1013 13:08:01.237180 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.334310 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.334471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.334496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.334525 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.334547 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.437372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.437439 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.437464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.437494 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.437518 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.540860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.540934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.540958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.540989 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.541013 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.644569 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.644643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.644670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.644701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.644728 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.747674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.747715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.747726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.747742 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.747753 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.851131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.851205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.851228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.851259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.851283 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.954544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.954597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.954614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.954638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:01 crc kubenswrapper[4797]: I1013 13:08:01.954656 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:01Z","lastTransitionTime":"2025-10-13T13:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.058145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.058227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.058244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.058271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.058288 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.161148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.161229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.161266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.161297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.161326 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.237647 4797 scope.go:117] "RemoveContainer" containerID="dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50" Oct 13 13:08:02 crc kubenswrapper[4797]: E1013 13:08:02.238183 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.264591 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.264667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.264693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.264726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.264750 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.367299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.367345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.367363 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.367379 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.367391 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.470381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.470449 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.470471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.470503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.470524 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.573179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.573233 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.573252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.573277 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.573296 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.675241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.675311 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.675329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.675352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.675370 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.777938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.778010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.778031 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.778057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.778077 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.881217 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.881268 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.881284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.881304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.881340 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.983682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.983747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.983769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.983796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:02 crc kubenswrapper[4797]: I1013 13:08:02.983850 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:02Z","lastTransitionTime":"2025-10-13T13:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.090686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.090754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.090773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.090799 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.090850 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.193524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.193584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.193602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.193626 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.193647 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.235219 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.235284 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.235343 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:03 crc kubenswrapper[4797]: E1013 13:08:03.235564 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.235939 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:03 crc kubenswrapper[4797]: E1013 13:08:03.236071 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:03 crc kubenswrapper[4797]: E1013 13:08:03.236172 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:03 crc kubenswrapper[4797]: E1013 13:08:03.236313 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.256680 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.274174 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.289944 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.296125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.296196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.296315 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.296411 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.296432 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.318102 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.339262 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.360150 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.379071 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.399603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc 
kubenswrapper[4797]: I1013 13:08:03.399654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.399696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.399727 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.399745 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.410659 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.434429 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.451355 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.467656 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e9
6b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.482867 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc 
kubenswrapper[4797]: I1013 13:08:03.498736 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.503255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.503301 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.503314 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.503331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.503343 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.513558 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.526610 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.552839 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.571332 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.589585 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:03Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.606095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.606137 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.606149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.606167 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.606180 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.709178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.709255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.709274 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.709302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.709331 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.812347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.812413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.812434 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.812461 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.812482 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.915422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.915482 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.915499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.915522 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:03 crc kubenswrapper[4797]: I1013 13:08:03.915545 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:03Z","lastTransitionTime":"2025-10-13T13:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.019171 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.019234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.019279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.019305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.019322 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.121797 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.121912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.121977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.122007 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.122025 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.225031 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.225107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.225142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.225173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.225199 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.328339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.328392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.328409 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.328434 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.328451 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.431677 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.431735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.431773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.431847 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.431874 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.539712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.539771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.539796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.539872 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.539900 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.643785 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.643838 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.643856 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.643871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.643883 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.746999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.747061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.747078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.747105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.747121 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.851349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.851627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.851642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.851655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.851667 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.954642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.954700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.954722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.954751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:04 crc kubenswrapper[4797]: I1013 13:08:04.954773 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:04Z","lastTransitionTime":"2025-10-13T13:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.057985 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.058040 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.058051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.058067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.058077 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.165619 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.165668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.165686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.165708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.165725 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.235187 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.235222 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.235316 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.235306 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:05 crc kubenswrapper[4797]: E1013 13:08:05.235955 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:05 crc kubenswrapper[4797]: E1013 13:08:05.236143 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:05 crc kubenswrapper[4797]: E1013 13:08:05.236199 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:05 crc kubenswrapper[4797]: E1013 13:08:05.236168 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.268326 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.268372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.268388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.268409 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.268425 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.371387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.371471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.371495 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.371523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.371544 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.474647 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.474695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.474712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.474735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.474752 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.577354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.577385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.577396 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.577421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.577436 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.679714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.680082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.680232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.680615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.680795 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.797496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.797532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.797544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.797562 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.797575 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.899891 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.899977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.899992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.900008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:05 crc kubenswrapper[4797]: I1013 13:08:05.900631 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:05Z","lastTransitionTime":"2025-10-13T13:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.002980 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.003016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.003031 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.003050 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.003064 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.105756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.105837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.105862 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.105894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.105916 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.207666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.207716 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.207732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.207755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.207771 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.310505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.310554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.310566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.310614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.310628 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.414109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.414163 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.414183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.414207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.414225 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.517275 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.517307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.517317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.517333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.517344 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.618744 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.618774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.618782 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.618796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.618820 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.720596 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.720672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.720696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.720726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.720749 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.823600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.823687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.823713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.823752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.823777 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.926412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.926480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.926504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.926532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:06 crc kubenswrapper[4797]: I1013 13:08:06.926552 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:06Z","lastTransitionTime":"2025-10-13T13:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.029226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.029259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.029269 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.029284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.029294 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.131987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.132021 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.132029 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.132042 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.132053 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.234733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.234787 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.234801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.234839 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.234859 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.235072 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.235105 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:07 crc kubenswrapper[4797]: E1013 13:08:07.235188 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.235241 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:07 crc kubenswrapper[4797]: E1013 13:08:07.235308 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.235342 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:07 crc kubenswrapper[4797]: E1013 13:08:07.235402 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:07 crc kubenswrapper[4797]: E1013 13:08:07.235466 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.337467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.337795 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.338002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.338137 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.338258 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.440321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.440352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.440362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.440377 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.440387 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.546752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.546837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.546858 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.546878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.546891 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.649368 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.649397 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.649405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.649418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.649427 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.752246 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.752319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.752336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.752360 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.752377 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.854836 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.854883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.854896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.854915 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.854928 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.957689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.957734 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.957766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.957785 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:07 crc kubenswrapper[4797]: I1013 13:08:07.957799 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:07Z","lastTransitionTime":"2025-10-13T13:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.060715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.060763 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.060775 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.060831 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.060845 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.186387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.186443 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.186460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.186483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.186502 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.289421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.289464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.289474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.289491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.289502 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.391950 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.391989 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.392000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.392015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.392026 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.494544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.494592 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.494608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.494629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.494644 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.597286 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.597320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.597331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.597346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.597357 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.699212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.699292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.699321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.699353 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.699377 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.801663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.801712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.801724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.801741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.801754 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.904349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.904435 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.904453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.904475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:08 crc kubenswrapper[4797]: I1013 13:08:08.904492 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:08Z","lastTransitionTime":"2025-10-13T13:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.007475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.007524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.007539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.007556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.007568 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.113660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.113696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.113707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.113723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.113733 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.217043 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.217680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.217694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.217717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.217732 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.235562 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:09 crc kubenswrapper[4797]: E1013 13:08:09.235755 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.236101 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:09 crc kubenswrapper[4797]: E1013 13:08:09.236189 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.236649 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:09 crc kubenswrapper[4797]: E1013 13:08:09.236708 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.236878 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:09 crc kubenswrapper[4797]: E1013 13:08:09.236992 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.320576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.320675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.320694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.320749 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.320769 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.423466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.423504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.423513 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.423528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.423538 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.527062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.527328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.527413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.527484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.527508 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.629999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.630049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.630061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.630083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.630097 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.733470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.733516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.733524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.733544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.733556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.837419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.837473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.837484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.837502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.837514 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.940144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.940192 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.940202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.940221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:09 crc kubenswrapper[4797]: I1013 13:08:09.940231 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:09Z","lastTransitionTime":"2025-10-13T13:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.043150 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.043231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.043254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.043289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.043311 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.145378 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.145423 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.145432 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.145447 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.145458 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.247441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.247500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.247517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.247540 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.247558 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.349505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.349540 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.349551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.349564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.349574 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.454413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.454483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.454500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.454526 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.454551 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.557035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.557083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.557091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.557105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.557114 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.659392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.659427 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.659436 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.659451 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.659459 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.762035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.762110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.762127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.762151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.762171 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.864656 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.864702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.864713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.864731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.864743 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.966901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.966959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.966976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.966999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:10 crc kubenswrapper[4797]: I1013 13:08:10.967016 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:10Z","lastTransitionTime":"2025-10-13T13:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.069363 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.069407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.069420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.069435 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.069449 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.172087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.172150 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.172180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.172206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.172223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.236015 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.236162 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.236540 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.236612 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.236660 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.236716 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.237013 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.237192 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.255375 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.255486 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.255535 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:08:43.255519996 +0000 UTC m=+100.789070252 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.275062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.275124 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.275142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.275167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.275187 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.377569 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.377623 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.377631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.377649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.377658 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.445896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.445954 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.445971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.445990 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.446004 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.461267 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:11Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.465473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.465519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.465529 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.465547 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.465559 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.484328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.484388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.484407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.484434 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.484452 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.500079 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:11Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.504253 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.504299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.504312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.504331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.504343 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.518168 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:11Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.522056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.522089 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.522100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.522116 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.522126 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.532840 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:11Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:11 crc kubenswrapper[4797]: E1013 13:08:11.532996 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.534413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.534445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.534456 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.534472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.534484 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.636922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.636984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.636993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.637007 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.637037 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.739917 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.739964 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.739974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.739989 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.740002 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.847973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.848025 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.848038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.848056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.848068 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.950725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.950786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.950838 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.950863 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:11 crc kubenswrapper[4797]: I1013 13:08:11.950881 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:11Z","lastTransitionTime":"2025-10-13T13:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.052973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.053023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.053038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.053055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.053069 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.155573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.155616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.155626 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.155640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.155649 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.258243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.258290 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.258300 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.258313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.258322 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.360639 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.360705 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.360728 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.360756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.360792 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.463769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.463839 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.463851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.463868 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.463880 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.566583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.566676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.566708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.566739 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.566761 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.669974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.670034 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.670102 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.670128 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.670148 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.703519 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gbdx_b2ab9f14-aae8-45ef-880e-a1563e920f87/kube-multus/0.log" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.703615 4797 generic.go:334] "Generic (PLEG): container finished" podID="b2ab9f14-aae8-45ef-880e-a1563e920f87" containerID="414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7" exitCode=1 Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.703669 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gbdx" event={"ID":"b2ab9f14-aae8-45ef-880e-a1563e920f87","Type":"ContainerDied","Data":"414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.704451 4797 scope.go:117] "RemoveContainer" containerID="414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.725003 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.747036 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.768733 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.773369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.773409 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.773420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.773438 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.773452 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.788293 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.801176 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.817612 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.827980 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.840466 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc 
kubenswrapper[4797]: I1013 13:08:12.851310 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.868242 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda0
8736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.875942 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.875971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.875981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.875996 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.876005 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.879383 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.894873 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 
2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.906554 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.923356 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.933276 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a4
02a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.945861 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.958222 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.971306 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:12Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.978487 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.978519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.978528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.978543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:12 crc kubenswrapper[4797]: I1013 13:08:12.978554 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:12Z","lastTransitionTime":"2025-10-13T13:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.081435 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.081471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.081480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.081495 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.081505 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.183259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.183293 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.183302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.183316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.183325 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.236051 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.236092 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.236109 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.236109 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:13 crc kubenswrapper[4797]: E1013 13:08:13.236194 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:13 crc kubenswrapper[4797]: E1013 13:08:13.236288 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:13 crc kubenswrapper[4797]: E1013 13:08:13.236441 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:13 crc kubenswrapper[4797]: E1013 13:08:13.237061 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.237340 4797 scope.go:117] "RemoveContainer" containerID="dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.249905 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684
da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.274517 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.286386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.286437 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.286450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.286467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.286478 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.290187 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.304143 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.315950 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.330732 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.347288 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.359323 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.369265 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.385610 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.389493 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.389523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.389532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.389546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.389556 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.398358 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.408814 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.421070 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.436276 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.445757 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.455833 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.464759 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e9
6b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.473866 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc 
kubenswrapper[4797]: I1013 13:08:13.491931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.491973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.491985 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.492002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.492015 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.597943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.597972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.597983 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.597997 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.598007 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.700458 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.700493 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.700502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.700516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.700526 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.708040 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/2.log" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.710081 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.711153 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.713564 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gbdx_b2ab9f14-aae8-45ef-880e-a1563e920f87/kube-multus/0.log" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.713612 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gbdx" event={"ID":"b2ab9f14-aae8-45ef-880e-a1563e920f87","Type":"ContainerStarted","Data":"fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.726854 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.740332 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.752577 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.770592 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.785795 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.800090 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.803098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.803139 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.803155 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.803176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.803192 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.819208 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.841620 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.853627 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.862932 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.871556 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e9
6b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.880756 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc 
kubenswrapper[4797]: I1013 13:08:13.891560 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf
68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.904333 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-s
cheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.905365 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.905428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.905444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.905895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.905949 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:13Z","lastTransitionTime":"2025-10-13T13:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.920274 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.933114 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.944349 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.955795 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.965260 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.974503 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc kubenswrapper[4797]: I1013 13:08:13.984649 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:13 crc 
kubenswrapper[4797]: I1013 13:08:13.995458 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.008547 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.008599 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.008615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.008640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.008653 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.018263 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.034227 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.047851 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.062411 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.076132 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.087125 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a4
02a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.097409 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.109494 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.110450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.110482 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.110495 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.110510 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.110521 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.123437 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6d
f41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.138095 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.155214 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.175705 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.198550 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.210010 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.212278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.212318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.212326 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.212340 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.212348 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.314492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.314527 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.314539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.314552 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.314561 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.417126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.417156 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.417166 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.417182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.417193 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.519708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.519751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.519769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.519790 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.519844 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.622272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.622328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.622340 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.622359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.622370 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.718072 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/3.log" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.718612 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/2.log" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.721382 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" exitCode=1 Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.721421 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.721460 4797 scope.go:117] "RemoveContainer" containerID="dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.722074 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:08:14 crc kubenswrapper[4797]: E1013 13:08:14.722224 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.732368 4797 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.732437 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.732510 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.732590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.732623 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.761282 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.779359 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.795586 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.812872 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.835420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.835476 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.835497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.835524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.835547 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.838890 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.856383 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.870778 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.880928 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.896570 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.908493 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.925022 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.937506 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.937541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.937551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:14 crc 
kubenswrapper[4797]: I1013 13:08:14.937566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.937577 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:14Z","lastTransitionTime":"2025-10-13T13:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.942915 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.968052 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb589aa6432f0be09c19acfbe0267d2c3e9906ab1b90494b2e097aec3c51e50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:07:49Z\\\",\\\"message\\\":\\\"ap[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1013 13:07:49.213399 6403 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:07:49.213404 6403 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1013 13:07:49.213463 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:14Z\\\",\\\"message\\\":\\\"LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1013 13:08:14.032896 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:08:14.032901 6746 services_controller.go:360] Finished syncing service metrics on namespace openshift-service-ca-operator for 
network=default : 1.704972ms\\\\nI1013 13:08:14.032930 6746 services_controller.go:356] Processing sync for service openshift-authentication-operator/metrics for network=default\\\\nF1013 13:08:14.032949 6746 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 
2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2
603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.979467 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.988309 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:14 crc kubenswrapper[4797]: I1013 13:08:14.998454 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:14Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.007513 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc 
kubenswrapper[4797]: I1013 13:08:15.018530 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.039118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.039148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.039159 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.039174 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.039186 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.141138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.141183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.141195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.141211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.141223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.235664 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.235721 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.235717 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.235687 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:15 crc kubenswrapper[4797]: E1013 13:08:15.235798 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:15 crc kubenswrapper[4797]: E1013 13:08:15.235956 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:15 crc kubenswrapper[4797]: E1013 13:08:15.236001 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:15 crc kubenswrapper[4797]: E1013 13:08:15.236064 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.243604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.243671 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.243682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.243700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.243711 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.345876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.345909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.345919 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.345933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.345941 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.448697 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.448768 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.448789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.448841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.448861 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.551966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.552030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.552047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.552072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.552088 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.655080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.655134 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.655169 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.655190 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.655210 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.727986 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/3.log" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.733321 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:08:15 crc kubenswrapper[4797]: E1013 13:08:15.733577 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.750888 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.758469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.758546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.758566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.758598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.758618 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.767922 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.782483 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.795754 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.809188 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.821614 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.832252 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.848584 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:14Z\\\",\\\"message\\\":\\\"LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1013 13:08:14.032896 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:08:14.032901 6746 services_controller.go:360] Finished syncing service metrics on namespace openshift-service-ca-operator for network=default : 1.704972ms\\\\nI1013 13:08:14.032930 6746 services_controller.go:356] Processing sync for service openshift-authentication-operator/metrics for network=default\\\\nF1013 13:08:14.032949 6746 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.859370 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.860766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.860801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.860857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.860872 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.860883 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.875548 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.886476 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.897015 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc 
kubenswrapper[4797]: I1013 13:08:15.908699 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.926236 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.938019 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.949661 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.963026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.963069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.963082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.963100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.963111 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:15Z","lastTransitionTime":"2025-10-13T13:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.963624 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:15 crc kubenswrapper[4797]: I1013 13:08:15.981897 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:15Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.065841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.065868 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.065876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.065889 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.065897 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.168225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.168261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.168273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.168289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.168299 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.270609 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.270717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.270730 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.270749 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.270761 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.374216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.374280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.374298 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.374332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.374348 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.476568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.476654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.476666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.476684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.476696 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.579747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.579852 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.579880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.579910 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.579932 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.682937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.682998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.683016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.683039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.683056 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.785755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.785831 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.785844 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.785863 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.785875 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.889286 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.889348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.889367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.889392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.889411 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.992615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.992683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.992702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.992726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:16 crc kubenswrapper[4797]: I1013 13:08:16.992744 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:16Z","lastTransitionTime":"2025-10-13T13:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.095366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.095428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.095444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.095472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.095489 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.198829 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.198884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.198898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.198914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.198955 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.235954 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.236025 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.235954 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:17 crc kubenswrapper[4797]: E1013 13:08:17.236154 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.236215 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:17 crc kubenswrapper[4797]: E1013 13:08:17.236350 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:17 crc kubenswrapper[4797]: E1013 13:08:17.236540 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:17 crc kubenswrapper[4797]: E1013 13:08:17.236780 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.301793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.301879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.301907 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.301938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.301964 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.405049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.405105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.405122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.405145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.405162 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.507552 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.507645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.507662 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.507685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.507702 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.610392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.610431 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.610440 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.610454 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.610465 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.712513 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.712569 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.712584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.712603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.712614 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.815798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.815914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.815938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.815966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.815987 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.918465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.918530 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.918548 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.918574 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:17 crc kubenswrapper[4797]: I1013 13:08:17.918592 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:17Z","lastTransitionTime":"2025-10-13T13:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.021279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.021324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.021335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.021352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.021363 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.124518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.124559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.124567 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.124582 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.124592 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.228082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.228133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.228149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.228173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.228190 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.331093 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.331178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.331212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.331245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.331267 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.434217 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.434287 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.434304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.434332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.434350 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.537791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.537903 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.537940 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.537973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.537998 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.641715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.641779 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.641796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.641850 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.641867 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.744257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.744335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.744359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.744382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.744398 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.847633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.847708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.847735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.847759 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.847776 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.950386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.950504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.950542 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.950574 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:18 crc kubenswrapper[4797]: I1013 13:08:18.950594 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:18Z","lastTransitionTime":"2025-10-13T13:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.053943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.054001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.054018 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.054039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.054056 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.158104 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.158168 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.158185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.158212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.158231 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.235656 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.235731 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.235770 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.235732 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 13 13:08:19 crc kubenswrapper[4797]: E1013 13:08:19.235950 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 13 13:08:19 crc kubenswrapper[4797]: E1013 13:08:19.236202 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5"
Oct 13 13:08:19 crc kubenswrapper[4797]: E1013 13:08:19.236243 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 13 13:08:19 crc kubenswrapper[4797]: E1013 13:08:19.236355 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.260638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.260680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.260695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.260714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.260731 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.363198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.363271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.363282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.363303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.363322 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.466147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.466216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.466241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.466277 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.466299 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.569709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.569786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.569859 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.569907 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.569929 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.674221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.674306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.674331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.674366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.674391 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.777859 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.777948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.777973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.778006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.778034 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.881401 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.881460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.881478 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.881502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.881520 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.984610 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.984675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.984698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.984726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:19 crc kubenswrapper[4797]: I1013 13:08:19.984746 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:19Z","lastTransitionTime":"2025-10-13T13:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.121576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.121624 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.121638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.121658 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.121671 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.224681 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.224736 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.224755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.224778 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.224797 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.327930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.327990 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.328010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.328033 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.328050 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.430054 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.430120 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.430131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.430147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.430159 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.533713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.533867 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.533898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.533931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.533955 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.637091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.637155 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.637178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.637209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.637233 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.741676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.741835 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.741854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.741880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.741898 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.845324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.845387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.845410 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.845441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.845464 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.948655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.948857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.948901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.948931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:20 crc kubenswrapper[4797]: I1013 13:08:20.948951 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:20Z","lastTransitionTime":"2025-10-13T13:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.051941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.052006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.052023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.052051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.052070 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.155067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.155141 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.155168 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.155196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.155214 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.235207 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.235248 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.235395 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.235602 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.235629 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.235739 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.235910 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.236098 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.258464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.258564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.258583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.258608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.258625 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.362097 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.362147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.362158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.362176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.362191 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.465938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.465986 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.466000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.466018 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.466033 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.568763 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.568845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.568864 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.568888 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.568906 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.610879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.610961 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.610984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.611019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.611042 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.632292 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:21Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.636843 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.637130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.637309 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.637503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.637685 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.658867 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:21Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.663798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.663881 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.663898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.663923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.663942 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.684679 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:21Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.689713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.689760 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.689776 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.689800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.689843 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.711469 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:21Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.716272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.716366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.716389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.716424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.716448 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.735446 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:21Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:21 crc kubenswrapper[4797]: E1013 13:08:21.735681 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.737602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.737637 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.737649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.737665 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.737677 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.841006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.841062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.841078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.841101 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.841122 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.949112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.949186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.949209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.949239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:21 crc kubenswrapper[4797]: I1013 13:08:21.949264 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:21Z","lastTransitionTime":"2025-10-13T13:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.052318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.052669 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.052869 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.053036 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.053174 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.156148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.156213 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.156235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.156263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.156284 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.259270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.259321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.259339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.259360 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.259376 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.361955 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.362045 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.362069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.362092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.362108 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.464708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.464884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.464912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.464940 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.464962 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.567297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.567357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.567372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.567393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.567407 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.671072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.671133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.671152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.671175 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.671192 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.774655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.774730 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.774754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.774788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.774843 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.877252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.877332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.877345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.877373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.877386 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.980636 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.980722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.980745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.980778 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:22 crc kubenswrapper[4797]: I1013 13:08:22.980844 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:22Z","lastTransitionTime":"2025-10-13T13:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.083509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.083577 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.083599 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.083627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.083647 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.186209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.186243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.186252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.186265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.186273 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.235225 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.235322 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.235345 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.235282 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:23 crc kubenswrapper[4797]: E1013 13:08:23.235512 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:23 crc kubenswrapper[4797]: E1013 13:08:23.235710 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:23 crc kubenswrapper[4797]: E1013 13:08:23.235969 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:23 crc kubenswrapper[4797]: E1013 13:08:23.236131 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.258712 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.289876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.289953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.289979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.290010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.290038 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.298279 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:14Z\\\",\\\"message\\\":\\\"LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1013 13:08:14.032896 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:08:14.032901 6746 services_controller.go:360] 
Finished syncing service metrics on namespace openshift-service-ca-operator for network=default : 1.704972ms\\\\nI1013 13:08:14.032930 6746 services_controller.go:356] Processing sync for service openshift-authentication-operator/metrics for network=default\\\\nF1013 13:08:14.032949 6746 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.323152 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.346990 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.368363 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.381423 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc 
kubenswrapper[4797]: I1013 13:08:23.392267 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.392303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.392313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.392326 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.392337 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.393358 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.406372 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.418275 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.436512 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.450851 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.469181 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.486458 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad7
5a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.496188 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.496249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.496270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.496294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.496313 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.501608 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca
7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.525977 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.545336 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:
05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.565121 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.583189 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:23Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.598913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.598953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.598966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.598985 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.598998 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.702051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.702149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.702170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.702197 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.702216 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.806016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.806080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.806099 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.806126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.806145 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.909414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.909471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.909491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.909517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:23 crc kubenswrapper[4797]: I1013 13:08:23.909535 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:23Z","lastTransitionTime":"2025-10-13T13:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.013374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.013431 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.013449 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.013471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.013489 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.117058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.117225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.117252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.117329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.117352 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.219874 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.219934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.219956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.219987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.220010 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.323341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.323402 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.323421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.323445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.323463 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.426970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.427048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.427084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.427115 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.427137 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.530347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.530423 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.530448 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.530477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.530501 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.634129 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.634201 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.634219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.634242 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.634261 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.737267 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.737338 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.737375 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.737402 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.737423 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.840914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.840976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.840996 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.841021 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.841048 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.944148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.944209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.944234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.944263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:24 crc kubenswrapper[4797]: I1013 13:08:24.944285 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:24Z","lastTransitionTime":"2025-10-13T13:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.047466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.047544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.047568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.047714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.047757 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.150689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.151147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.151173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.151202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.151222 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.235178 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.235259 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.235391 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:25 crc kubenswrapper[4797]: E1013 13:08:25.235576 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:25 crc kubenswrapper[4797]: E1013 13:08:25.235758 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.235879 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:25 crc kubenswrapper[4797]: E1013 13:08:25.236051 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:25 crc kubenswrapper[4797]: E1013 13:08:25.236496 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.253616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.253674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.253698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.253727 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.253749 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.356547 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.356595 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.356611 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.356632 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.356649 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.459941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.460012 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.460036 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.460065 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.460084 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.562707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.562873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.562902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.562931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.562951 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.666253 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.666296 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.666309 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.666328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.666343 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.773630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.773690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.773711 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.773735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.773751 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.877358 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.877438 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.877464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.877501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.877527 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.980534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.980588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.980606 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.980630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:25 crc kubenswrapper[4797]: I1013 13:08:25.980647 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:25Z","lastTransitionTime":"2025-10-13T13:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.083370 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.083429 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.083446 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.083469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.083488 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.186960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.187022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.187039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.187063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.187081 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.290199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.290583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.290611 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.290635 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.290651 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.393244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.393316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.393333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.393357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.393375 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.496407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.496464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.496481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.496505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.496524 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.600539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.600601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.600621 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.600646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.600667 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.703966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.704061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.704081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.704136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.704153 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.807344 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.807439 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.807458 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.807512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.807532 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.910359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.910418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.910436 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.910461 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:26 crc kubenswrapper[4797]: I1013 13:08:26.910480 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:26Z","lastTransitionTime":"2025-10-13T13:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.013160 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.013226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.013249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.013278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.013331 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.116541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.116585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.116603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.116628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.116647 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.219161 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.219224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.219242 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.219268 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.219286 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.236126 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.236135 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.236256 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.236386 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.236499 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.236377 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.236612 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.236691 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.252996 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.322219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.322268 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.322284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.322309 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.322332 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.425601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.425655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.425672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.425694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.425712 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.528229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.528294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.528313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.528341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.528363 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.540677 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.540990 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:31.540952898 +0000 UTC m=+149.074503184 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.631399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.631491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.631519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.631550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.631574 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.642483 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.642564 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.642606 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.642691 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.642737 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.642903 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.642921 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:09:31.642889019 +0000 UTC m=+149.176439315 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.642934 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.642953 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643009 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:31.642991581 +0000 UTC m=+149.176541877 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643106 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643107 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643212 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 13:09:31.643185896 +0000 UTC m=+149.176736222 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643126 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643254 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:08:27 crc kubenswrapper[4797]: E1013 13:08:27.643300 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 13:09:31.643290939 +0000 UTC m=+149.176841305 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.733971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.734043 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.734066 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.734098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.734117 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.837449 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.837509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.837527 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.837552 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.837569 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.940442 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.940523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.940547 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.940577 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:27 crc kubenswrapper[4797]: I1013 13:08:27.940600 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:27Z","lastTransitionTime":"2025-10-13T13:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.043373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.043444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.043471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.043499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.043520 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.147127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.147199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.147222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.147255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.147278 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.236710 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:08:28 crc kubenswrapper[4797]: E1013 13:08:28.237007 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.250916 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.251020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.251046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.251079 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.251104 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.353577 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.353634 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.353649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.353675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.353693 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.460404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.460481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.460499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.460521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.460543 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.564434 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.564964 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.565186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.565342 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.565514 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.669239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.669299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.669317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.669339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.669357 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.772509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.772564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.772581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.772604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.772622 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.876208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.876295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.876313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.876336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.876352 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.979613 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.979673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.979690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.979713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:28 crc kubenswrapper[4797]: I1013 13:08:28.979731 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:28Z","lastTransitionTime":"2025-10-13T13:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.084325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.084397 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.084419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.084450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.084472 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.187116 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.187169 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.187185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.187207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.187225 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.236133 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.236166 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.236240 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:29 crc kubenswrapper[4797]: E1013 13:08:29.236299 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.236335 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:29 crc kubenswrapper[4797]: E1013 13:08:29.236419 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:29 crc kubenswrapper[4797]: E1013 13:08:29.236559 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:29 crc kubenswrapper[4797]: E1013 13:08:29.236756 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.290591 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.290667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.290688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.290713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.290731 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.393319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.393372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.393384 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.393403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.393416 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.496481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.496524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.496536 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.496553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.496566 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.599123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.599182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.599202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.599226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.599245 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.701586 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.701642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.701660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.701683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.701699 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.804725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.804789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.804846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.804876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.804897 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.908260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.908331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.908349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.908375 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:29 crc kubenswrapper[4797]: I1013 13:08:29.908393 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:29Z","lastTransitionTime":"2025-10-13T13:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.011894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.011992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.012017 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.012053 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.012090 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.114454 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.114521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.114539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.114561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.114583 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.217713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.217752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.217763 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.217780 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.217791 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.321095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.321164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.321187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.321216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.321241 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.424427 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.424491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.424515 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.424545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.424571 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.528245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.528325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.528347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.528377 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.528404 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.631502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.631565 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.631583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.631608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.631628 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.734314 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.734366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.734384 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.734407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.734424 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.836553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.836630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.836648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.836673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.836691 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.940150 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.940217 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.940235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.940261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:30 crc kubenswrapper[4797]: I1013 13:08:30.940290 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:30Z","lastTransitionTime":"2025-10-13T13:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.046691 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.046777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.046801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.046869 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.046903 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.149977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.150033 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.150051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.150075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.150091 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.235085 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:31 crc kubenswrapper[4797]: E1013 13:08:31.235234 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.235483 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:31 crc kubenswrapper[4797]: E1013 13:08:31.235638 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.235946 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:31 crc kubenswrapper[4797]: E1013 13:08:31.236040 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.236087 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:31 crc kubenswrapper[4797]: E1013 13:08:31.236210 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.252691 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.252739 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.252756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.252777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.252850 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.356200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.356305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.356331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.356362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.356388 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.459191 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.459245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.459262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.459284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.459301 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.562439 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.562515 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.562534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.562559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.562578 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.666173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.666227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.666246 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.666272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.666290 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.768828 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.769160 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.769177 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.769198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.769214 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.871912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.871981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.872004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.872035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.872055 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.974969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.975036 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.975059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.975090 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:31 crc kubenswrapper[4797]: I1013 13:08:31.975115 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:31Z","lastTransitionTime":"2025-10-13T13:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.078109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.078185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.078205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.078229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.078246 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.109700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.109756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.109774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.109798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.109844 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: E1013 13:08:32.129854 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.134786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.134867 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.134892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.134923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.134949 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: E1013 13:08:32.154252 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{[...]}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.159786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.159874 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.159892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.159916 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.159934 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: E1013 13:08:32.180333 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{[...]}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.185726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.185800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.185864 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.185897 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.185918 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: E1013 13:08:32.206348 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.211090 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.211148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.211164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.211187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.211208 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: E1013 13:08:32.230946 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:32Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:32 crc kubenswrapper[4797]: E1013 13:08:32.231270 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.233554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.233612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.233639 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.233667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.233690 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.337014 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.337084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.337107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.337137 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.337156 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.439706 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.439794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.439827 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.439852 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.439869 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.542166 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.542212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.542223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.542241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.542253 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.644597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.644645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.644656 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.644672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.644680 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.748161 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.748238 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.748261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.748291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.748313 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.851751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.851866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.851903 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.851934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.851954 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.955406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.955467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.955484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.955509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:32 crc kubenswrapper[4797]: I1013 13:08:32.955526 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:32Z","lastTransitionTime":"2025-10-13T13:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.058463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.058527 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.058545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.058570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.058587 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.162162 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.162235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.162258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.162290 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.162311 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.235966 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.236009 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.236147 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:33 crc kubenswrapper[4797]: E1013 13:08:33.236200 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.236236 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:33 crc kubenswrapper[4797]: E1013 13:08:33.236360 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:33 crc kubenswrapper[4797]: E1013 13:08:33.236476 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:33 crc kubenswrapper[4797]: E1013 13:08:33.236557 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.260618 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.264445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.264472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.264483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.264499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.264512 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.282281 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.301437 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.322440 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.355719 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:14Z\\\",\\\"message\\\":\\\"LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1013 13:08:14.032896 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:08:14.032901 6746 services_controller.go:360] Finished syncing service metrics on namespace openshift-service-ca-operator for network=default : 1.704972ms\\\\nI1013 13:08:14.032930 6746 services_controller.go:356] Processing sync for service openshift-authentication-operator/metrics for network=default\\\\nF1013 13:08:14.032949 6746 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.366829 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.366879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.366896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.366921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.366940 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.377147 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.393300 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.410741 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.427592 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc 
kubenswrapper[4797]: I1013 13:08:33.447157 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.470768 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.470875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.470894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.470919 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.470936 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.471877 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.490488 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.507056 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c1f95fe-a038-41aa-b56d-69043c556391\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3b10625b7649804d048f23322a40de7a36368dd72faed1c1f3a313c64f452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114ccc69a396d2e48941fe7dc1e9b68f9cf6297746930ee02ce8aa98273064d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://114ccc69a396d2e48941fe7dc1e9b68f9cf6297746930ee02ce8aa98273064d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.544789 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.565517 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.574063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.574119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.574138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.574160 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.574182 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.585775 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.606937 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.624419 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.639767 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:33Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.677088 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.677143 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.677159 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.677183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.677200 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.782207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.782295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.782323 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.782354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.782389 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.885416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.885453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.885465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.885480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.885491 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.988383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.988456 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.988468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.988485 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:33 crc kubenswrapper[4797]: I1013 13:08:33.988497 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:33Z","lastTransitionTime":"2025-10-13T13:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.091773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.091846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.091865 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.091887 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.091903 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.194362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.194426 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.194672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.194710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.194735 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.298003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.298070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.298087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.298112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.298131 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.400610 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.400668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.400687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.400710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.400728 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.504418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.504497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.504522 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.504551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.504574 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.607936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.608018 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.608045 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.608070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.608087 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.711470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.711549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.711570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.711604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.711629 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.814725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.814783 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.814801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.814855 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.814875 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.918194 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.918272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.918295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.918324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:34 crc kubenswrapper[4797]: I1013 13:08:34.918346 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:34Z","lastTransitionTime":"2025-10-13T13:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.021278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.021333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.021350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.021376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.021394 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.125422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.125493 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.125511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.125538 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.125556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.228167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.228229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.228247 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.228278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.228300 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.235884 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.235924 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:35 crc kubenswrapper[4797]: E1013 13:08:35.236044 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.236125 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.236151 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:35 crc kubenswrapper[4797]: E1013 13:08:35.236234 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:35 crc kubenswrapper[4797]: E1013 13:08:35.236388 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:35 crc kubenswrapper[4797]: E1013 13:08:35.236485 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.331292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.331340 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.331357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.331379 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.331397 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.435016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.435074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.435090 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.435113 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.435130 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.538118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.538189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.538215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.538246 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.538269 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.641608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.641683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.641707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.641753 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.641776 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.744465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.744499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.744508 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.744524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.744534 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.847258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.847324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.847348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.847376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.847399 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.950804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.950895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.950911 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.950932 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:35 crc kubenswrapper[4797]: I1013 13:08:35.950949 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:35Z","lastTransitionTime":"2025-10-13T13:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.053266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.053468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.053505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.053533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.053556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.157584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.157651 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.157670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.157694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.157710 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.260921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.260981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.260999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.261026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.261043 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.364802 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.364876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.364893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.364916 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.364933 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.468044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.468110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.468129 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.468152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.468170 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.570897 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.570945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.570961 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.570983 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.570999 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.673272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.673344 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.673369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.673398 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.673418 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.776354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.776404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.776420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.776444 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.776463 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.879296 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.879355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.879372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.879394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.879413 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.982138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.982222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.982284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.982321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:36 crc kubenswrapper[4797]: I1013 13:08:36.982342 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:36Z","lastTransitionTime":"2025-10-13T13:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.085870 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.085933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.085949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.085968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.085981 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.189559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.189986 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.190148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.190184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.190207 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.235866 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.236198 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.236128 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.236084 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:37 crc kubenswrapper[4797]: E1013 13:08:37.236740 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:37 crc kubenswrapper[4797]: E1013 13:08:37.237012 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:37 crc kubenswrapper[4797]: E1013 13:08:37.237165 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:37 crc kubenswrapper[4797]: E1013 13:08:37.237255 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.293461 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.293536 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.293560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.293588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.293606 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.396923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.397015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.397039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.397075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.397100 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.500642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.500712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.500738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.500769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.500792 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.604228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.604278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.604294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.604320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.604336 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.707617 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.707684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.707711 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.707742 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.707765 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.810854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.810919 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.810936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.810962 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.810984 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.914673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.914726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.914743 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.914767 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:37 crc kubenswrapper[4797]: I1013 13:08:37.914784 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:37Z","lastTransitionTime":"2025-10-13T13:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.017581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.017653 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.017677 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.017707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.017729 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.120319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.120389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.120412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.120441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.120465 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.223516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.223559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.223570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.223593 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.223605 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.325896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.325972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.325991 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.326016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.326033 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.428266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.428332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.428352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.428381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.428400 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.531885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.531958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.531979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.532009 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.532031 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.634357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.634416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.634435 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.634459 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.634480 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.737945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.738017 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.738038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.738067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.738089 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.844179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.844260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.844285 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.844484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.844547 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.947909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.947960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.947976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.948001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:38 crc kubenswrapper[4797]: I1013 13:08:38.948020 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:38Z","lastTransitionTime":"2025-10-13T13:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.051343 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.051794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.052041 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.052210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.052352 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.155395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.155462 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.155481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.155505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.155522 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.235609 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.235674 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.235689 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.235794 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:39 crc kubenswrapper[4797]: E1013 13:08:39.235830 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:39 crc kubenswrapper[4797]: E1013 13:08:39.235966 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:39 crc kubenswrapper[4797]: E1013 13:08:39.236191 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:39 crc kubenswrapper[4797]: E1013 13:08:39.236698 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.237235 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:08:39 crc kubenswrapper[4797]: E1013 13:08:39.237482 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.258972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.259037 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.259055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.259077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.259098 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.362243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.362312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.362329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.362354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.362372 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.465725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.465764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.465775 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.465789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.465800 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.569272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.569330 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.569346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.569369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.569392 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.673061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.673393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.673586 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.673866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.674240 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.777376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.777428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.777450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.777476 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.777500 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.879777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.879886 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.879913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.879946 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.879969 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.983557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.983631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.983653 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.983685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:39 crc kubenswrapper[4797]: I1013 13:08:39.983705 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:39Z","lastTransitionTime":"2025-10-13T13:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.086790 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.086893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.086918 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.086949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.086970 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.190064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.190187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.190215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.190245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.190266 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.292927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.292981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.292994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.293010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.293021 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.396079 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.396150 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.396186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.396222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.396244 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.500092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.500182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.500208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.500237 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.500261 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.603147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.603216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.603236 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.603259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.603277 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.706330 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.706393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.706415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.706439 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.706456 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.809683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.809735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.809747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.809766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.809778 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.912190 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.912239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.912253 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.912273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:40 crc kubenswrapper[4797]: I1013 13:08:40.912288 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:40Z","lastTransitionTime":"2025-10-13T13:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.015315 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.015389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.015401 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.015417 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.015448 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.118602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.118644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.118657 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.118678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.118692 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.221392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.221459 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.221477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.221503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.221525 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.235779 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.235868 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:41 crc kubenswrapper[4797]: E1013 13:08:41.236019 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.236085 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.236121 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:41 crc kubenswrapper[4797]: E1013 13:08:41.236274 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:41 crc kubenswrapper[4797]: E1013 13:08:41.236407 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:41 crc kubenswrapper[4797]: E1013 13:08:41.236557 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.329935 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.330168 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.330233 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.330271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.330290 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.433974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.434023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.434048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.434074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.434092 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.537578 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.537642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.537659 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.537683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.537701 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.640433 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.640479 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.640489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.640513 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.640526 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.743161 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.743268 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.743291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.743317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.743338 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.845869 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.845965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.845977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.845993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.846005 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.947971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.948015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.948025 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.948039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:41 crc kubenswrapper[4797]: I1013 13:08:41.948051 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:41Z","lastTransitionTime":"2025-10-13T13:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.051542 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.051597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.051617 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.051705 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.051727 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.154674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.154739 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.154756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.154778 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.154795 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.249216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.249280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.249298 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.249322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.249339 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: E1013 13:08:42.269237 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:42Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.273550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.273603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.273622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.273643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.273660 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: E1013 13:08:42.290658 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:42Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.294487 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.294531 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.294549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.294570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.294587 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: E1013 13:08:42.312707 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:42Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.317682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.317758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.317786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.317853 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.317881 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: E1013 13:08:42.337401 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:42Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.342349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.342372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.342380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.342392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.342408 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: E1013 13:08:42.356161 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7c305ae9-a0eb-4806-bd54-a7ad9c447299\\\",\\\"systemUUID\\\":\\\"1126131d-f382-4ed8-9b1e-fad3c0f5c993\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:42Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:42 crc kubenswrapper[4797]: E1013 13:08:42.356379 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.358167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.358231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.358249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.358270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.358287 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.461358 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.461388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.461396 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.461407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.461415 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.564688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.564752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.564774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.564842 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.564867 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.668135 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.668183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.668199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.668220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.668237 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.770678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.770745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.770862 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.770906 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.770930 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.874481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.874595 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.874617 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.874642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.874680 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.978211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.978270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.978288 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.978311 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:42 crc kubenswrapper[4797]: I1013 13:08:42.978329 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:42Z","lastTransitionTime":"2025-10-13T13:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.081463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.081514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.081531 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.081555 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.081572 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.199181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.199234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.199249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.199267 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.199278 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.235331 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.235413 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.235356 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:43 crc kubenswrapper[4797]: E1013 13:08:43.235575 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:43 crc kubenswrapper[4797]: E1013 13:08:43.235682 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.235750 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:43 crc kubenswrapper[4797]: E1013 13:08:43.235776 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:43 crc kubenswrapper[4797]: E1013 13:08:43.235903 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.258294 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f37f607-3b81-4e33-878e-e78a69b89d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f65bab26af0e0d003d4e1a27dc4bdb84b64b5f6143e363973331a3fb6d26b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81fc000b6df41386d24f9077cee4aa0ceb4733774dc37d225495575543e84a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6529ac19e0d9f2b6ecc69e041e75c9767c971617166ca22bb29349b3b3965b1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153aaedead7ed76ab97858342317945de885afe80c00d9873d1a03444c47f67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.274168 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345b1c60-ba79-407d-8423-53010f2dfeb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9188982d992b79d058393a141055552eeb63bc5cd53178991e62b3df7604f55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d8
27a3b45148d0cf68e8596c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4mqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hrdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.284884 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7c2fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c10f51f-a7f1-4ab8-8d9c-fc358bd7f2c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55ca323dd3a92ee203542f4ef7bb8be990bcfc8f75f125c562127129aecefc5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgkjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7c2fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.300970 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gbdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2ab9f14-aae8-45ef-880e-a1563e920f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:12Z\\\",\\\"message\\\":\\\"2025-10-13T13:07:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0\\\\n2025-10-13T13:07:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_241ac6c6-17e4-4495-a5d5-736f3cee06d0 to /host/opt/cni/bin/\\\\n2025-10-13T13:07:27Z [verbose] multus-daemon started\\\\n2025-10-13T13:07:27Z [verbose] Readiness Indicator file check\\\\n2025-10-13T13:08:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkc2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gbdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.301665 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.301717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.301736 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.301758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.301776 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.315130 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:43 crc kubenswrapper[4797]: E1013 13:08:43.315302 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:08:43 crc kubenswrapper[4797]: E1013 13:08:43.315441 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs podName:e65d35bc-209d-4438-ae53-31deb132aaf5 nodeName:}" failed. No retries permitted until 2025-10-13 13:09:47.315420905 +0000 UTC m=+164.848971271 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs") pod "network-metrics-daemon-pdvg5" (UID: "e65d35bc-209d-4438-ae53-31deb132aaf5") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.329472 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658edc6a-9975-4d8b-9551-821edcc32ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T13:08:14Z\\\",\\\"message\\\":\\\"LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1013 13:08:14.032896 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1013 13:08:14.032901 6746 services_controller.go:360] 
Finished syncing service metrics on namespace openshift-service-ca-operator for network=default : 1.704972ms\\\\nI1013 13:08:14.032930 6746 services_controller.go:356] Processing sync for service openshift-authentication-operator/metrics for network=default\\\\nF1013 13:08:14.032949 6746 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:13Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T13:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dhk2q_openshift-ovn-kubernetes(658edc6a-9975-4d8b-9551-821edcc32ce1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6339ec20a5a892de02
4630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2wq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dhk2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.349972 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a225515-1318-413d-aafe-877c9f16f598\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94213b2963fead3db49bc98dfdf6347265b92e3a0a965295610e496d2e1f03fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://015492cb16b3cf6dedc1936f90cf03d1331bfd1fddf6a257c719a6bf102691f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a259d99cae7127eb6fc8ad5446de3eda5a06da45868ab2325a89fc9c44f1d34c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e84f914871c37c2c5cf2767af6a88354e4e59af0cbe5b178b80e1372d50629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba76df71160260346f2ecd968722de778b7d2b3dcb8673d6ec770964965384fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff207205d6f17838682cab641fe9c77fad739e5454c206f376440741bed8724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.364655 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.378550 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.395904 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65d35bc-209d-4438-ae53-31deb132aaf5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nspn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pdvg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc 
kubenswrapper[4797]: I1013 13:08:43.404524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.404554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.404562 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.404574 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.404584 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.408788 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69631c207b244ea05458caf7f67665697be6b3794c1aac98d0ac8d23df060e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.420528 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5jgrm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680a49a0-7eff-44a9-8ab8-e4b52f4743c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875be57ff7356a934b342acd8ae700f66656680be4e58e6cfccdc0407b66ddea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pptw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5jgrm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.432692 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b24c284-a754-4877-83cc-334b0a893a47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72031333bb0302ca8e823981a07e96b3bf16d02fbdb918d4fd3e79f36d86c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d788f4cc7913f42c5282aea7303a5463ec8
718dba6372a30c505e1648f230e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znzc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvhmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.453746 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6681a37da700a80ffec94aef9264f87838622029c76a2badc7b8f4a7e9e167e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.467230 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050a7223e7496fca4ef77f2d73f6aefc921ac5accb7ecaa34609524388da6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f59a6104215a3e6febd2e26c286b00895bcd8a45719acbd8e86d6fb5683df39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.486665 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.507073 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.507140 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.507165 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.507196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.507218 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.511487 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a6e41f-8980-41db-a008-d5a81058cdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4187a9b2b8147080c704bcda550e1fa94124e2d876e766361e95907a8805d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fca726ceb1c3d6ba2023acc8fcf6829a7843a2da5209dc77f3d1f499542984\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d315709aee893961a174d0368efcf68e50e45845bdce18b40b96f5d49a8ac12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e5eeb32046ffa3c309bb0649ed24bb4149050e757d4252bc5ed8e0593e1b139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12ad75a6e287db934cda0d0128e5445d47433aa12807b4145ce0bd28c36b08a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578ddb49fd9525eafe74852b96ea1f3e320cbe40fa15ef4da3e4269f9bc23fc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d96793d08ac170b3c25abd53c779b2ebecf10b5271c5f3eb4f9cbc524ba65c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g9sc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hc9bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.529412 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b68fe527-212f-427f-853a-037035463262\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca584d4ecaf82f6fb7822ce377920e84fa94325d8c157e75bdcbbe45a125fa17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11160e205816f4de995be138142cca7672957f217e49bf9f4761ae2cb132e9da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034f468ed62eb8201ad4abdbf235c13b6c9ff8e3fe2494ad768f7047e188bc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0e62409684da122b3385446a402a798c47eca9f32aeb43f734f94dc498f95d23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.542499 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c1f95fe-a038-41aa-b56d-69043c556391\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3b10625b7649804d048f23322a40de7a36368dd72faed1c1f3a313c64f452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114ccc69a396d2e48941fe7dc1e9b68f9cf6297746930ee02ce8aa98273064d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://114ccc69a396d2e48941fe7dc1e9b68f9cf6297746930ee02ce8aa98273064d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.574160 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0ece0b-2009-4af8-a479-18fe277add03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T13:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe67dfd1c3ab4ca933a08e0384f2c38dccf755989a2d788f7c96bd8c2005c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2839d93188aac6edbd17e7c1dc6d3b6004d3c1d8d03c559205b1f180ca7fc722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d103007fe3545a7470b16b99638c5d5c87f34918e102e1453d4f7ee1fa67109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://221933b30055ace0b4911bda08736e1c703b7757d55fadb3114ae39d038e4b19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01ae91c2e82dd02eb4c0aced1159efd45e3a0570a4db649f2fd2b58681419471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T13:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9634ab258db11c80c4fea57a4a31969b811204d49513802e9fd1e584c9baeeb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T13:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4eac0af23e91572524547b0ce92c10d435b55d0cd15ca4cfc1f49bda2de8bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af79d6bbb6a3c16b532ba2234d3373011c151ffa801eb1ae5ae947142a64bcfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T13:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T13:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T13:07:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T13:08:43Z is after 2025-08-24T17:21:41Z" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.609951 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.610011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.610028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.610051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.610068 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.713514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.713567 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.713585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.713606 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.713623 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.816794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.816911 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.816935 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.816966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.816987 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.919384 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.919434 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.919452 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.919475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:43 crc kubenswrapper[4797]: I1013 13:08:43.919492 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:43Z","lastTransitionTime":"2025-10-13T13:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.022697 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.022766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.022789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.022849 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.022870 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.125223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.125249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.125258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.125516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.125542 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.229591 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.229649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.229667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.229691 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.229709 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.236157 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:44 crc kubenswrapper[4797]: E1013 13:08:44.236328 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.333184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.333257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.333325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.333354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.333373 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.436644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.436715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.436737 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.436768 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.436790 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.539844 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.539927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.539951 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.539980 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.540000 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.642766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.642876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.642895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.642918 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.642935 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.746328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.746467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.746491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.746516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.746533 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.848747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.848778 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.848786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.848799 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.848820 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.952123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.952185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.952202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.952227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:44 crc kubenswrapper[4797]: I1013 13:08:44.952244 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:44Z","lastTransitionTime":"2025-10-13T13:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.056000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.056065 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.056086 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.056117 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.056138 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.159848 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.159910 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.159929 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.159956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.159973 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.235305 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:45 crc kubenswrapper[4797]: E1013 13:08:45.235476 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.235529 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.235539 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:45 crc kubenswrapper[4797]: E1013 13:08:45.235939 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:45 crc kubenswrapper[4797]: E1013 13:08:45.236083 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.263239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.263319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.263344 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.263378 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.263403 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.368019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.368084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.368101 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.368228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.368252 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.471280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.471323 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.471334 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.471349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.471361 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.574894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.574938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.574954 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.574979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.574998 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.678135 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.678190 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.678207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.678229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.678247 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.781504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.781570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.781588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.781612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.781629 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.885125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.885164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.885173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.885186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.885198 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.988415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.988457 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.988468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.988484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:45 crc kubenswrapper[4797]: I1013 13:08:45.988495 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:45Z","lastTransitionTime":"2025-10-13T13:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.091961 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.092028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.092049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.092074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.092094 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.195538 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.195633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.195664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.195695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.195716 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.235061 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:46 crc kubenswrapper[4797]: E1013 13:08:46.235234 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.298191 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.298262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.298286 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.298316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.298338 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.401406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.401477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.401504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.401536 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.401558 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.505181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.505260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.505283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.505317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.505340 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.609107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.609170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.609186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.609210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.609233 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.712463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.712535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.712558 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.712588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.712610 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.817092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.817158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.817181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.817209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.817231 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.920963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.921031 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.921057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.921094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:46 crc kubenswrapper[4797]: I1013 13:08:46.921113 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:46Z","lastTransitionTime":"2025-10-13T13:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.024257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.024294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.024304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.024317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.024326 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.127725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.127769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.127786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.127851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.127884 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.230154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.230200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.230217 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.230262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.230279 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.235860 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.235907 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.236050 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:47 crc kubenswrapper[4797]: E1013 13:08:47.236252 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:47 crc kubenswrapper[4797]: E1013 13:08:47.236449 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:47 crc kubenswrapper[4797]: E1013 13:08:47.236702 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.333410 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.333480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.333504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.333533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.333560 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.436007 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.436123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.436140 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.436162 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.436179 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.539376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.539443 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.539465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.539495 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.539516 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.642265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.642418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.643060 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.643148 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.643176 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.746944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.747006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.747025 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.747047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.747065 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.850616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.850710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.850735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.850762 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.850782 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.953672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.953752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.953778 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.953862 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:47 crc kubenswrapper[4797]: I1013 13:08:47.953891 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:47Z","lastTransitionTime":"2025-10-13T13:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.056876 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.056987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.057008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.057030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.057049 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.160048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.160106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.160125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.160150 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.160168 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.235194 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:48 crc kubenswrapper[4797]: E1013 13:08:48.235398 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.263199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.263281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.263305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.263334 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.263361 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.366998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.367056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.367079 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.367109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.367132 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.469377 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.469443 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.469467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.469494 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.469519 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.572083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.572138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.572157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.572180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.572196 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.675490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.675563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.675585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.675614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.675634 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.778741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.778797 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.778844 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.778873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.778902 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.881942 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.881989 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.882007 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.882033 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.882054 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.984960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.985037 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.985063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.985091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:48 crc kubenswrapper[4797]: I1013 13:08:48.985113 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:48Z","lastTransitionTime":"2025-10-13T13:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.087320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.087371 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.087387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.087412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.087429 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.190612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.190686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.190710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.190741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.190762 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.235256 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:49 crc kubenswrapper[4797]: E1013 13:08:49.235459 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.235543 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.235610 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:49 crc kubenswrapper[4797]: E1013 13:08:49.236064 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:49 crc kubenswrapper[4797]: E1013 13:08:49.236196 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.293561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.293860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.294022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.294172 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.294316 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.397980 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.398032 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.398048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.398071 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.398089 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.500885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.501314 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.501493 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.501639 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.501786 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.605696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.605770 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.605788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.605842 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.605860 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.709226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.709288 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.709305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.709329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.709346 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.812477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.812586 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.812604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.812632 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.812648 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.915234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.915295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.915317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.915348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:49 crc kubenswrapper[4797]: I1013 13:08:49.915372 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:49Z","lastTransitionTime":"2025-10-13T13:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.017764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.017883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.017913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.017942 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.017961 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.121225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.121289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.121313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.121342 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.121366 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.223620 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.223670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.223688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.223710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.223727 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.236039 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:50 crc kubenswrapper[4797]: E1013 13:08:50.236356 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.326907 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.326973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.326995 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.327026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.327048 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.430592 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.430661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.430684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.430713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.430734 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.535014 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.535082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.535106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.535135 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.535156 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.638336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.638388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.638403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.638422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.638438 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.741462 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.741526 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.741560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.741592 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.741615 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.845002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.845092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.845118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.845147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.845169 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.948407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.948475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.948499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.948545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:50 crc kubenswrapper[4797]: I1013 13:08:50.948568 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:50Z","lastTransitionTime":"2025-10-13T13:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.052419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.052474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.052492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.052519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.052541 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.155691 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.155787 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.155837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.155869 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.155889 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.235672 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.235735 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:51 crc kubenswrapper[4797]: E1013 13:08:51.235903 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.236030 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:51 crc kubenswrapper[4797]: E1013 13:08:51.236199 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:51 crc kubenswrapper[4797]: E1013 13:08:51.236566 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.259189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.259257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.259281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.259310 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.259331 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.362282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.362347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.362367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.362396 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.362418 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.466598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.466708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.466733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.466841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.466863 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.570220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.570284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.570304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.570329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.570368 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.673348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.673404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.673438 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.673477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.673501 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.776766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.776856 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.776880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.776909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.776930 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.879973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.880051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.880084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.880112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.880133 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.983149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.983218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.983240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.983263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:51 crc kubenswrapper[4797]: I1013 13:08:51.983279 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:51Z","lastTransitionTime":"2025-10-13T13:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.090915 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.090976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.090994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.091019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.091037 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.193658 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.193735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.193759 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.193790 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.193845 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.235135 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:52 crc kubenswrapper[4797]: E1013 13:08:52.235317 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.297218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.297268 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.297285 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.297308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.297325 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.401231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.401295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.401312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.401346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.401366 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.504648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.504698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.504714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.504734 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.504749 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.606747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.606781 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.606826 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.606846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.606860 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.677907 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.678084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.678125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.678159 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.678184 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.716294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.716367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.716392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.716422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.716453 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T13:08:52Z","lastTransitionTime":"2025-10-13T13:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.755433 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp"] Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.756053 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.759730 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.759955 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.760766 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.761107 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.816578 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec5c2f3-9cba-453d-9807-f39eee146340-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.816693 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5c2f3-9cba-453d-9807-f39eee146340-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.816757 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/5ec5c2f3-9cba-453d-9807-f39eee146340-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.816966 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec5c2f3-9cba-453d-9807-f39eee146340-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.817080 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5ec5c2f3-9cba-453d-9807-f39eee146340-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.828599 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hc9bk" podStartSLOduration=89.828570714 podStartE2EDuration="1m29.828570714s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.807117564 +0000 UTC m=+110.340667890" watchObservedRunningTime="2025-10-13 13:08:52.828570714 +0000 UTC m=+110.362121000" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.828999 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=59.828991205 
podStartE2EDuration="59.828991205s" podCreationTimestamp="2025-10-13 13:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.8283899 +0000 UTC m=+110.361940186" watchObservedRunningTime="2025-10-13 13:08:52.828991205 +0000 UTC m=+110.362541501" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.844898 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.844870388 podStartE2EDuration="25.844870388s" podCreationTimestamp="2025-10-13 13:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.844389216 +0000 UTC m=+110.377939522" watchObservedRunningTime="2025-10-13 13:08:52.844870388 +0000 UTC m=+110.378420674" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.883788 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=86.883758579 podStartE2EDuration="1m26.883758579s" podCreationTimestamp="2025-10-13 13:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.882726474 +0000 UTC m=+110.416276790" watchObservedRunningTime="2025-10-13 13:08:52.883758579 +0000 UTC m=+110.417308875" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918125 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5c2f3-9cba-453d-9807-f39eee146340-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918190 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5ec5c2f3-9cba-453d-9807-f39eee146340-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918234 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec5c2f3-9cba-453d-9807-f39eee146340-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918257 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5ec5c2f3-9cba-453d-9807-f39eee146340-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec5c2f3-9cba-453d-9807-f39eee146340-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5ec5c2f3-9cba-453d-9807-f39eee146340-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: 
\"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.918595 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5ec5c2f3-9cba-453d-9807-f39eee146340-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.920013 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5c2f3-9cba-453d-9807-f39eee146340-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.931188 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec5c2f3-9cba-453d-9807-f39eee146340-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.943436 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ec5c2f3-9cba-453d-9807-f39eee146340-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qgxwp\" (UID: \"5ec5c2f3-9cba-453d-9807-f39eee146340\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.961360 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=89.961339318 podStartE2EDuration="1m29.961339318s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.947620709 +0000 UTC m=+110.481170995" watchObservedRunningTime="2025-10-13 13:08:52.961339318 +0000 UTC m=+110.494889574" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.961878 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podStartSLOduration=89.961874221 podStartE2EDuration="1m29.961874221s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.961546863 +0000 UTC m=+110.495097149" watchObservedRunningTime="2025-10-13 13:08:52.961874221 +0000 UTC m=+110.495424477" Oct 13 13:08:52 crc kubenswrapper[4797]: I1013 13:08:52.975145 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7c2fp" podStartSLOduration=89.975129669 podStartE2EDuration="1m29.975129669s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:52.974187726 +0000 UTC m=+110.507738032" watchObservedRunningTime="2025-10-13 13:08:52.975129669 +0000 UTC m=+110.508679935" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.028468 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.028447778 podStartE2EDuration="1m30.028447778s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-13 13:08:53.010185816 +0000 UTC m=+110.543736162" watchObservedRunningTime="2025-10-13 13:08:53.028447778 +0000 UTC m=+110.561998034" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.060025 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6gbdx" podStartSLOduration=90.060005688 podStartE2EDuration="1m30.060005688s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:53.059729551 +0000 UTC m=+110.593279827" watchObservedRunningTime="2025-10-13 13:08:53.060005688 +0000 UTC m=+110.593555944" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.084491 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.146534 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5jgrm" podStartSLOduration=90.146513278 podStartE2EDuration="1m30.146513278s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:53.13242859 +0000 UTC m=+110.665978876" watchObservedRunningTime="2025-10-13 13:08:53.146513278 +0000 UTC m=+110.680063544" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.158876 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvhmz" podStartSLOduration=89.158852093 podStartE2EDuration="1m29.158852093s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:53.147365529 +0000 UTC 
m=+110.680915795" watchObservedRunningTime="2025-10-13 13:08:53.158852093 +0000 UTC m=+110.692402359" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.249638 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.249667 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.249760 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:53 crc kubenswrapper[4797]: E1013 13:08:53.251124 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:53 crc kubenswrapper[4797]: E1013 13:08:53.251199 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:53 crc kubenswrapper[4797]: E1013 13:08:53.251246 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.873377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" event={"ID":"5ec5c2f3-9cba-453d-9807-f39eee146340","Type":"ContainerStarted","Data":"a66888bea26690f62dcf302b2ca71c20a77217777816fc79241f5b26cbc50d9a"} Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.873441 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" event={"ID":"5ec5c2f3-9cba-453d-9807-f39eee146340","Type":"ContainerStarted","Data":"b9bfe50c89c11bde6e5629350dbbc4799ef1caa5e1e38cabd55588a1a519e27b"} Oct 13 13:08:53 crc kubenswrapper[4797]: I1013 13:08:53.893487 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qgxwp" podStartSLOduration=90.893459832 podStartE2EDuration="1m30.893459832s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:53.891687118 +0000 UTC m=+111.425237414" watchObservedRunningTime="2025-10-13 13:08:53.893459832 +0000 UTC m=+111.427010128" Oct 13 13:08:54 crc kubenswrapper[4797]: I1013 13:08:54.235945 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:54 crc kubenswrapper[4797]: E1013 13:08:54.236145 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:54 crc kubenswrapper[4797]: I1013 13:08:54.236625 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:08:54 crc kubenswrapper[4797]: I1013 13:08:54.878247 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/3.log" Oct 13 13:08:54 crc kubenswrapper[4797]: I1013 13:08:54.881120 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerStarted","Data":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} Oct 13 13:08:54 crc kubenswrapper[4797]: I1013 13:08:54.881564 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:08:54 crc kubenswrapper[4797]: I1013 13:08:54.918219 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podStartSLOduration=90.918194946 podStartE2EDuration="1m30.918194946s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:08:54.917665483 +0000 UTC m=+112.451215759" watchObservedRunningTime="2025-10-13 
13:08:54.918194946 +0000 UTC m=+112.451745202" Oct 13 13:08:55 crc kubenswrapper[4797]: I1013 13:08:55.235684 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:55 crc kubenswrapper[4797]: I1013 13:08:55.235745 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:55 crc kubenswrapper[4797]: I1013 13:08:55.235781 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:55 crc kubenswrapper[4797]: E1013 13:08:55.235861 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:55 crc kubenswrapper[4797]: E1013 13:08:55.235970 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:55 crc kubenswrapper[4797]: E1013 13:08:55.236138 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:55 crc kubenswrapper[4797]: I1013 13:08:55.321113 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pdvg5"] Oct 13 13:08:55 crc kubenswrapper[4797]: I1013 13:08:55.321407 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:55 crc kubenswrapper[4797]: E1013 13:08:55.321510 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.236037 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.236295 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:57 crc kubenswrapper[4797]: E1013 13:08:57.236660 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.236371 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.236397 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:57 crc kubenswrapper[4797]: E1013 13:08:57.236987 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 13:08:57 crc kubenswrapper[4797]: E1013 13:08:57.237053 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pdvg5" podUID="e65d35bc-209d-4438-ae53-31deb132aaf5" Oct 13 13:08:57 crc kubenswrapper[4797]: E1013 13:08:57.236903 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.713393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.713600 4797 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.762916 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.763684 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.764602 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.765513 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.765947 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-72vb4"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.766998 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.769194 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.769777 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.770061 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.771158 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxrg4"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.771750 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.772428 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.773025 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vw685"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.774308 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.778704 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.778894 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.779010 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.779378 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.779711 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.780066 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.781422 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l89kw"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.782051 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.784977 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.786275 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx7"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.786849 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.787141 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.787540 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.802472 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.802534 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.802547 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.802590 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.803056 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 13 13:08:57 crc 
kubenswrapper[4797]: I1013 13:08:57.803599 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.804879 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.805084 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.805257 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.805473 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.805952 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.805965 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.805699 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.806186 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.806418 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.806518 4797 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.806601 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.806688 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.807203 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.807537 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.814737 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.823659 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.823860 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.823989 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.824318 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.824546 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.824642 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.824713 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.824663 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.824941 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.825135 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.825320 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.825547 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.826122 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.829063 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.831776 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.835965 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 13 13:08:57 crc 
kubenswrapper[4797]: I1013 13:08:57.836057 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.836101 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.836389 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.836793 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837236 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8xklm"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837454 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837622 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837634 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837731 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837825 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837900 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 13 
13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837914 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.837918 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.839531 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.839628 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.839749 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.839852 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.847304 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.847624 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.847778 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.848027 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.848184 4797 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.848200 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.850118 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.850764 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.852490 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jlq7q"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.853091 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.853175 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.854183 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.854487 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.855317 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.856282 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.856705 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.856733 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.856827 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.868252 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qssvq"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.882686 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885363 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-serving-cert\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885407 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a3c57-03b2-4adc-82a1-3aba68c83636-config\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-client-ca\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885461 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p2qf\" (UniqueName: \"kubernetes.io/projected/33f4ebe7-de91-4b1c-b157-4234d535e206-kube-api-access-9p2qf\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885486 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwz5l\" (UniqueName: \"kubernetes.io/projected/90473429-30ec-490e-a96f-d66fce3c994c-kube-api-access-hwz5l\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885510 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wzvr\" (UniqueName: \"kubernetes.io/projected/4f565495-cb16-4443-8018-24e277acac69-kube-api-access-5wzvr\") 
pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885530 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a9a3c57-03b2-4adc-82a1-3aba68c83636-images\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885548 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885569 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885593 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/df26da53-3ff8-4402-b5f6-25166b6b0f8a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-swfqw\" (UID: \"df26da53-3ff8-4402-b5f6-25166b6b0f8a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 
13:08:57.885613 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d637970b-bb85-4dd8-beb8-1f01479781d1-auth-proxy-config\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885631 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dae836b-33d9-45ed-9b21-13311ceff098-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885654 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885673 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-serving-cert\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885695 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7bzt\" (UniqueName: 
\"kubernetes.io/projected/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-kube-api-access-z7bzt\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885712 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-service-ca\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885735 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-config\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885769 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885794 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-config\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885831 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-config\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885849 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5kks\" (UniqueName: \"kubernetes.io/projected/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-kube-api-access-m5kks\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885887 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32de0c29-41b8-4e79-97c0-1d72acb48feb-serving-cert\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885903 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-serving-cert\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885923 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x7hf\" (UniqueName: \"kubernetes.io/projected/5dae836b-33d9-45ed-9b21-13311ceff098-kube-api-access-4x7hf\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885942 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f4ebe7-de91-4b1c-b157-4234d535e206-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg756\" (UniqueName: \"kubernetes.io/projected/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-kube-api-access-gg756\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.885980 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32de0c29-41b8-4e79-97c0-1d72acb48feb-trusted-ca\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886018 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-serving-cert\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886041 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9a3c57-03b2-4adc-82a1-3aba68c83636-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886065 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-client-ca\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886085 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d637970b-bb85-4dd8-beb8-1f01479781d1-config\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886104 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886143 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90473429-30ec-490e-a96f-d66fce3c994c-audit-dir\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886162 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-etcd-client\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886191 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-serving-cert\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886239 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-encryption-config\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886263 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v578k\" (UniqueName: \"kubernetes.io/projected/3a9a3c57-03b2-4adc-82a1-3aba68c83636-kube-api-access-v578k\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886291 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-encryption-config\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886353 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-serving-cert\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-audit-policies\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886406 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-image-import-ca\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886462 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-service-ca-bundle\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886492 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886581 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-config\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886604 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32de0c29-41b8-4e79-97c0-1d72acb48feb-config\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886623 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfhf\" (UniqueName: \"kubernetes.io/projected/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-kube-api-access-kqfhf\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886643 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-etcd-serving-ca\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886663 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtjf\" (UniqueName: \"kubernetes.io/projected/df26da53-3ff8-4402-b5f6-25166b6b0f8a-kube-api-access-8jtjf\") pod \"cluster-samples-operator-665b6dd947-swfqw\" (UID: \"df26da53-3ff8-4402-b5f6-25166b6b0f8a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886747 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886767 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn6xb\" (UniqueName: \"kubernetes.io/projected/c72e2007-fbd4-4c7a-a0fc-9c949a748441-kube-api-access-xn6xb\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886785 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhvg\" (UniqueName: \"kubernetes.io/projected/32de0c29-41b8-4e79-97c0-1d72acb48feb-kube-api-access-jwhvg\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886820 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-audit-policies\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886874 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90473429-30ec-490e-a96f-d66fce3c994c-node-pullsecrets\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886892 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-trusted-ca-bundle\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.886989 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-etcd-client\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887011 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltnv\" (UniqueName: \"kubernetes.io/projected/d637970b-bb85-4dd8-beb8-1f01479781d1-kube-api-access-cltnv\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887031 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-config\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887053 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dae836b-33d9-45ed-9b21-13311ceff098-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887073 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-oauth-config\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887187 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-audit-dir\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887241 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f4ebe7-de91-4b1c-b157-4234d535e206-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887264 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887349 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887379 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d637970b-bb85-4dd8-beb8-1f01479781d1-machine-approver-tls\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887514 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887535 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887560 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887581 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f565495-cb16-4443-8018-24e277acac69-audit-dir\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887602 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-oauth-serving-cert\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887619 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.887639 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-audit\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.890339 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.890411 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.890785 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.890921 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qssvq"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.890981 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.891068 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.897111 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.897462 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.909925 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.910699 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.911088 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.911195 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.911369 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.911536 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.911704 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.911908 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912073 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912204 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9pdrq"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912440 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912216 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912492 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9pdrq"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912408 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912434 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912797 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.912964 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.915296 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fr6cp"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.915374 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.915547 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.915825 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.915554 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.916415 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.916622 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.916901 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x9zw9"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.917856 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.919435 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.919539 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.919960 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.920240 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.920277 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.922004 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ngmd7"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.920339 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.921423 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.922395 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.922447 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.922581 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ngmd7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.923242 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.923717 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.923749 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.924003 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.924126 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.924205 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.924558 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.925267 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.925579 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.926074 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.926678 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.928535 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.929082 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.930119 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.931193 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mtgwk"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.931341 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.931881 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.933291 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.934196 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.934410 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.937901 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.938564 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.939913 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.940650 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.941169 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxck2"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.941611 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.944106 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.945087 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.945388 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7s6jm"]
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.946055 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.946556 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.946696 4797 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.947029 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.947239 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9pdrq"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.948603 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qssvq"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.950535 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jlq7q"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.951681 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.953198 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxrg4"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.957136 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.958726 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.962028 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-89wql"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.969418 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.972180 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.972446 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.973128 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-72vb4"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.973158 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.973219 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.979232 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vw685"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.979517 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx7"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.979538 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.979554 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.981059 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.984708 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x9zw9"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.986153 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l89kw"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.986399 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: 
\"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988527 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988558 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90473429-30ec-490e-a96f-d66fce3c994c-audit-dir\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988585 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-client-ca\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d637970b-bb85-4dd8-beb8-1f01479781d1-config\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988639 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-etcd-client\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: 
\"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988664 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-serving-cert\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988685 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-encryption-config\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988711 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v578k\" (UniqueName: \"kubernetes.io/projected/3a9a3c57-03b2-4adc-82a1-3aba68c83636-kube-api-access-v578k\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-encryption-config\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-serving-cert\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988827 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-audit-policies\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988854 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-service-ca-bundle\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988879 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988901 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-image-import-ca\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: 
I1013 13:08:57.988926 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988944 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfhf\" (UniqueName: \"kubernetes.io/projected/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-kube-api-access-kqfhf\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988960 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-config\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.988983 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32de0c29-41b8-4e79-97c0-1d72acb48feb-config\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989002 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: 
\"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989024 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-etcd-serving-ca\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989049 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jtjf\" (UniqueName: \"kubernetes.io/projected/df26da53-3ff8-4402-b5f6-25166b6b0f8a-kube-api-access-8jtjf\") pod \"cluster-samples-operator-665b6dd947-swfqw\" (UID: \"df26da53-3ff8-4402-b5f6-25166b6b0f8a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989071 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn6xb\" (UniqueName: \"kubernetes.io/projected/c72e2007-fbd4-4c7a-a0fc-9c949a748441-kube-api-access-xn6xb\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989089 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-audit-policies\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989125 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/90473429-30ec-490e-a96f-d66fce3c994c-node-pullsecrets\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhvg\" (UniqueName: \"kubernetes.io/projected/32de0c29-41b8-4e79-97c0-1d72acb48feb-kube-api-access-jwhvg\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989160 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-etcd-client\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989177 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-trusted-ca-bundle\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989202 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cltnv\" (UniqueName: \"kubernetes.io/projected/d637970b-bb85-4dd8-beb8-1f01479781d1-kube-api-access-cltnv\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989222 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dae836b-33d9-45ed-9b21-13311ceff098-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989250 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-ca\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989267 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-config\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989284 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989302 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f4ebe7-de91-4b1c-b157-4234d535e206-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 
13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989321 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-oauth-config\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989352 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-audit-dir\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989381 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d637970b-bb85-4dd8-beb8-1f01479781d1-machine-approver-tls\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989412 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989430 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989451 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d24fe086-ff48-4538-ad8a-3bb681cf9116-serving-cert\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989470 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989490 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f565495-cb16-4443-8018-24e277acac69-audit-dir\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989509 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989527 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989568 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-audit\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989588 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-oauth-serving-cert\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989605 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-serving-cert\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3a9a3c57-03b2-4adc-82a1-3aba68c83636-config\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989657 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p2qf\" (UniqueName: \"kubernetes.io/projected/33f4ebe7-de91-4b1c-b157-4234d535e206-kube-api-access-9p2qf\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989673 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-client-ca\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989693 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwz5l\" (UniqueName: \"kubernetes.io/projected/90473429-30ec-490e-a96f-d66fce3c994c-kube-api-access-hwz5l\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989712 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n45\" (UniqueName: \"kubernetes.io/projected/d24fe086-ff48-4538-ad8a-3bb681cf9116-kube-api-access-g9n45\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989733 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a9a3c57-03b2-4adc-82a1-3aba68c83636-images\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989750 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-config\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989771 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wzvr\" (UniqueName: \"kubernetes.io/projected/4f565495-cb16-4443-8018-24e277acac69-kube-api-access-5wzvr\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989790 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989840 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/df26da53-3ff8-4402-b5f6-25166b6b0f8a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-swfqw\" (UID: \"df26da53-3ff8-4402-b5f6-25166b6b0f8a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989858 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dae836b-33d9-45ed-9b21-13311ceff098-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989886 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d637970b-bb85-4dd8-beb8-1f01479781d1-auth-proxy-config\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989905 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989922 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-serving-cert\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989941 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7bzt\" (UniqueName: \"kubernetes.io/projected/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-kube-api-access-z7bzt\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989960 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-service-ca\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989980 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.989997 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-config\") pod \"apiserver-76f77b778f-72vb4\" (UID: 
\"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-config\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990034 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-service-ca\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990056 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32de0c29-41b8-4e79-97c0-1d72acb48feb-serving-cert\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990075 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-config\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990096 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5kks\" (UniqueName: 
\"kubernetes.io/projected/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-kube-api-access-m5kks\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990123 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-serving-cert\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7hf\" (UniqueName: \"kubernetes.io/projected/5dae836b-33d9-45ed-9b21-13311ceff098-kube-api-access-4x7hf\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990160 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg756\" (UniqueName: \"kubernetes.io/projected/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-kube-api-access-gg756\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990180 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f4ebe7-de91-4b1c-b157-4234d535e206-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" 
Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990198 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-serving-cert\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990216 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9a3c57-03b2-4adc-82a1-3aba68c83636-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990235 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-client\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.990254 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32de0c29-41b8-4e79-97c0-1d72acb48feb-trusted-ca\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.991836 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f565495-cb16-4443-8018-24e277acac69-audit-dir\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: 
\"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.991849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.991941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-service-ca-bundle\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.993079 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90473429-30ec-490e-a96f-d66fce3c994c-audit-dir\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.993231 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90473429-30ec-490e-a96f-d66fce3c994c-node-pullsecrets\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.993261 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-image-import-ca\") pod 
\"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.993324 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-audit-dir\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.993567 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-etcd-serving-ca\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.994129 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-client-ca\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.994265 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.994674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-audit\") pod \"apiserver-76f77b778f-72vb4\" (UID: 
\"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.994749 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dae836b-33d9-45ed-9b21-13311ceff098-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995254 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-config\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d637970b-bb85-4dd8-beb8-1f01479781d1-auth-proxy-config\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995542 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32de0c29-41b8-4e79-97c0-1d72acb48feb-trusted-ca\") pod 
\"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995587 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995651 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d637970b-bb85-4dd8-beb8-1f01479781d1-config\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995691 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.995830 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32de0c29-41b8-4e79-97c0-1d72acb48feb-config\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.996070 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-config\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.996108 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-service-ca\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.996181 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f4ebe7-de91-4b1c-b157-4234d535e206-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.996701 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-audit-policies\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.997159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a9a3c57-03b2-4adc-82a1-3aba68c83636-images\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.997159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-oauth-serving-cert\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.997269 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90473429-30ec-490e-a96f-d66fce3c994c-config\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.998005 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-trusted-ca-bundle\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.998110 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a3c57-03b2-4adc-82a1-3aba68c83636-config\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.998149 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mtgwk"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.998337 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:57 crc 
kubenswrapper[4797]: I1013 13:08:57.998467 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-config\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.999049 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw"] Oct 13 13:08:57 crc kubenswrapper[4797]: I1013 13:08:57.999658 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-config\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.000003 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.000099 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.000743 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-client-ca\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.002337 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-serving-cert\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.002478 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8xklm"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.004591 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.004766 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006060 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006112 
4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-serving-cert\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006425 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006899 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f4ebe7-de91-4b1c-b157-4234d535e206-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/df26da53-3ff8-4402-b5f6-25166b6b0f8a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-swfqw\" (UID: \"df26da53-3ff8-4402-b5f6-25166b6b0f8a\") 
" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.006964 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.007074 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dae836b-33d9-45ed-9b21-13311ceff098-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.007211 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.007229 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.007340 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.008024 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.008172 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-serving-cert\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.008419 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32de0c29-41b8-4e79-97c0-1d72acb48feb-serving-cert\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.008748 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.010653 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9a3c57-03b2-4adc-82a1-3aba68c83636-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.010898 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-audit-policies\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: 
\"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.016279 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.016302 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-encryption-config\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.016736 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-oauth-config\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.017084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-serving-cert\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.017210 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.018040 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.018451 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-encryption-config\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.018575 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-serving-cert\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.018696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d637970b-bb85-4dd8-beb8-1f01479781d1-machine-approver-tls\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.019411 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90473429-30ec-490e-a96f-d66fce3c994c-etcd-client\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 
13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.019616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-etcd-client\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.020234 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.024054 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.025352 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.027204 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.027943 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.030863 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fr6cp"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.030903 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-29smx"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.031577 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8cwgx"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.031978 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8cwgx" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.032474 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-29smx" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.033726 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.034693 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.035675 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxck2"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.050956 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.051596 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.053855 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-29smx"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.053909 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8cwgx"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.054715 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.055645 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-89wql"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.058765 4797 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.058789 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7s6jm"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.058813 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bvpb8"] Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.059436 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.065891 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.086056 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.091604 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-client\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.091728 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-ca\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.091763 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d24fe086-ff48-4538-ad8a-3bb681cf9116-serving-cert\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.091821 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n45\" (UniqueName: \"kubernetes.io/projected/d24fe086-ff48-4538-ad8a-3bb681cf9116-kube-api-access-g9n45\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.091845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-config\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.091891 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-service-ca\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.092651 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-ca\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.094235 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d24fe086-ff48-4538-ad8a-3bb681cf9116-serving-cert\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.106088 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.112924 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-config\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.125842 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.143561 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-client\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.145896 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.154283 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24fe086-ff48-4538-ad8a-3bb681cf9116-etcd-service-ca\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.166139 4797 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.185639 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.205436 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.225868 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.245560 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.265655 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.286606 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.306044 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.326198 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.345823 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.366570 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 13 13:08:58 crc 
kubenswrapper[4797]: I1013 13:08:58.385885 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.406145 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.426244 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.445683 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.467534 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.486578 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.506758 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.526158 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.545924 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.566948 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.586952 4797 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.606948 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.626646 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.646968 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.666661 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.686378 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.706579 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.725980 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.746330 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.766956 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 13 
13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.786198 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.806150 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.826126 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.857685 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.866707 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.886705 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.906947 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.932498 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.944222 4797 request.go:700] Waited for 1.011934668s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.945989 4797 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.966063 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 13 13:08:58 crc kubenswrapper[4797]: I1013 13:08:58.987108 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.006621 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.026288 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.047749 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.067123 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.086035 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.108191 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.127532 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.146583 4797 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.167169 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.186443 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.206679 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.226947 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.236209 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.236300 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.236480 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.236485 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.246850 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.266975 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.286073 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.306605 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.326145 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.357354 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.367763 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.387139 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.406870 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.426984 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 13 13:08:59 
crc kubenswrapper[4797]: I1013 13:08:59.446757 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.466696 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.486697 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.507005 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.526933 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.546714 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.586038 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.606310 4797 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.627553 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.646536 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.690442 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kqfhf\" (UniqueName: \"kubernetes.io/projected/c528ef7f-ab03-4c1a-96b7-6d88270b67ee-kube-api-access-kqfhf\") pod \"authentication-operator-69f744f599-l89kw\" (UID: \"c528ef7f-ab03-4c1a-96b7-6d88270b67ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.718683 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltnv\" (UniqueName: \"kubernetes.io/projected/d637970b-bb85-4dd8-beb8-1f01479781d1-kube-api-access-cltnv\") pod \"machine-approver-56656f9798-dmwr8\" (UID: \"d637970b-bb85-4dd8-beb8-1f01479781d1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.735946 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jtjf\" (UniqueName: \"kubernetes.io/projected/df26da53-3ff8-4402-b5f6-25166b6b0f8a-kube-api-access-8jtjf\") pod \"cluster-samples-operator-665b6dd947-swfqw\" (UID: \"df26da53-3ff8-4402-b5f6-25166b6b0f8a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.740082 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.754999 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn6xb\" (UniqueName: \"kubernetes.io/projected/c72e2007-fbd4-4c7a-a0fc-9c949a748441-kube-api-access-xn6xb\") pod \"console-f9d7485db-8xklm\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.777382 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhvg\" (UniqueName: \"kubernetes.io/projected/32de0c29-41b8-4e79-97c0-1d72acb48feb-kube-api-access-jwhvg\") pod \"console-operator-58897d9998-jlq7q\" (UID: \"32de0c29-41b8-4e79-97c0-1d72acb48feb\") " pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.785696 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.793634 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.794901 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7bzt\" (UniqueName: \"kubernetes.io/projected/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-kube-api-access-z7bzt\") pod \"controller-manager-879f6c89f-wxrg4\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.813878 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v578k\" (UniqueName: \"kubernetes.io/projected/3a9a3c57-03b2-4adc-82a1-3aba68c83636-kube-api-access-v578k\") pod \"machine-api-operator-5694c8668f-vw685\" (UID: \"3a9a3c57-03b2-4adc-82a1-3aba68c83636\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.832895 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wzvr\" (UniqueName: \"kubernetes.io/projected/4f565495-cb16-4443-8018-24e277acac69-kube-api-access-5wzvr\") pod \"oauth-openshift-558db77b4-5dpx7\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.855609 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwz5l\" (UniqueName: \"kubernetes.io/projected/90473429-30ec-490e-a96f-d66fce3c994c-kube-api-access-hwz5l\") pod \"apiserver-76f77b778f-72vb4\" (UID: \"90473429-30ec-490e-a96f-d66fce3c994c\") " pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.862412 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.871880 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p2qf\" (UniqueName: \"kubernetes.io/projected/33f4ebe7-de91-4b1c-b157-4234d535e206-kube-api-access-9p2qf\") pod \"openshift-controller-manager-operator-756b6f6bc6-vfstq\" (UID: \"33f4ebe7-de91-4b1c-b157-4234d535e206\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.896485 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x7hf\" (UniqueName: \"kubernetes.io/projected/5dae836b-33d9-45ed-9b21-13311ceff098-kube-api-access-4x7hf\") pod \"openshift-apiserver-operator-796bbdcf4f-mwhth\" (UID: \"5dae836b-33d9-45ed-9b21-13311ceff098\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.900000 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.912173 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg756\" (UniqueName: \"kubernetes.io/projected/f156a9cc-9d97-4364-a17d-f02d8b0f8abe-kube-api-access-gg756\") pod \"apiserver-7bbb656c7d-ccxwp\" (UID: \"f156a9cc-9d97-4364-a17d-f02d8b0f8abe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.923684 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.944253 4797 request.go:700] Waited for 1.911997208s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.944667 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.945921 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.948775 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5kks\" (UniqueName: \"kubernetes.io/projected/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-kube-api-access-m5kks\") pod \"route-controller-manager-6576b87f9c-lvjgk\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.954215 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.965498 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.969112 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.986070 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 13 13:08:59 crc kubenswrapper[4797]: I1013 13:08:59.996139 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.008622 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.026981 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.028945 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.047075 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.066073 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.072852 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.088321 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.107219 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.127380 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.162607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n45\" (UniqueName: \"kubernetes.io/projected/d24fe086-ff48-4538-ad8a-3bb681cf9116-kube-api-access-g9n45\") pod \"etcd-operator-b45778765-fr6cp\" (UID: \"d24fe086-ff48-4538-ad8a-3bb681cf9116\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.167127 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.168426 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.185729 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.205737 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.213245 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.226728 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.248073 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.265608 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.269836 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l89kw"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.272891 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326218 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6007d943-10d1-4737-9821-874c5fe1a043-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326251 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/812d4016-c6ce-4440-bd68-3328eb8b9421-images\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326269 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sppvk\" (UniqueName: \"kubernetes.io/projected/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-kube-api-access-sppvk\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326286 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df83a900-367b-4c33-8a5f-ac822552edde-profile-collector-cert\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326301 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326316 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc2e514b-61f8-47b1-975e-4a910550ecaa-config-volume\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326332 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdmf\" (UniqueName: \"kubernetes.io/projected/19ba1f1c-943a-416f-94cf-0f16d9908a88-kube-api-access-lsdmf\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326350 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bb8a2965-1b45-4811-8da9-72e901118c75-signing-cabundle\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326377 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-default-certificate\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326447 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v5w4d\" (UniqueName: \"kubernetes.io/projected/ffa6af37-afb0-4226-be8c-83399f30793a-kube-api-access-v5w4d\") pod \"multus-admission-controller-857f4d67dd-mtgwk\" (UID: \"ffa6af37-afb0-4226-be8c-83399f30793a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326464 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzclr\" (UniqueName: \"kubernetes.io/projected/fc2e514b-61f8-47b1-975e-4a910550ecaa-kube-api-access-kzclr\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326487 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-certificates\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326502 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16ad833e-f3a4-4af3-97b7-d960f7905292-webhook-cert\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326526 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83a79b04-2fae-4444-90fd-a165aab2f901-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: 
\"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326540 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-metrics-certs\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326624 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df83a900-367b-4c33-8a5f-ac822552edde-srv-cert\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326775 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6939a36c-2200-4151-bd6f-50f54ecdf4c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326791 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326822 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326837 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16ad833e-f3a4-4af3-97b7-d960f7905292-tmpfs\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326852 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19ba1f1c-943a-416f-94cf-0f16d9908a88-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326868 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326892 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-config\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326924 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6939a36c-2200-4151-bd6f-50f54ecdf4c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326946 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-config\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: 
I1013 13:09:00.326961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.326984 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xg9b\" (UniqueName: \"kubernetes.io/projected/16ad833e-f3a4-4af3-97b7-d960f7905292-kube-api-access-8xg9b\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327035 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327052 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bb8a2965-1b45-4811-8da9-72e901118c75-signing-key\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327098 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nx98\" (UniqueName: 
\"kubernetes.io/projected/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-kube-api-access-4nx98\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327115 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh7wp\" (UniqueName: \"kubernetes.io/projected/bb8a2965-1b45-4811-8da9-72e901118c75-kube-api-access-lh7wp\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327128 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc2e514b-61f8-47b1-975e-4a910550ecaa-secret-volume\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327156 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/73bc6ad6-8024-4d5b-a0b6-995e29f987af-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fb6d2\" (UID: \"73bc6ad6-8024-4d5b-a0b6-995e29f987af\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327185 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh57k\" (UniqueName: \"kubernetes.io/projected/812d4016-c6ce-4440-bd68-3328eb8b9421-kube-api-access-vh57k\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: 
\"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/812d4016-c6ce-4440-bd68-3328eb8b9421-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327234 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4kt\" (UniqueName: \"kubernetes.io/projected/ef80bed0-4ac7-4df7-87e9-72eb4299008a-kube-api-access-8c4kt\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327249 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327290 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62769b11-1e27-44f2-836c-da79ac2655b6-service-ca-bundle\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: 
I1013 13:09:00.327306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327322 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-bound-sa-token\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327347 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83a79b04-2fae-4444-90fd-a165aab2f901-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327364 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6007d943-10d1-4737-9821-874c5fe1a043-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327385 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327425 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ef80bed0-4ac7-4df7-87e9-72eb4299008a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327451 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-serving-cert\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a79b04-2fae-4444-90fd-a165aab2f901-config\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327514 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327531 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmh5\" (UniqueName: \"kubernetes.io/projected/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-kube-api-access-zdmh5\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327564 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6939a36c-2200-4151-bd6f-50f54ecdf4c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327581 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kdcw\" (UniqueName: \"kubernetes.io/projected/6939a36c-2200-4151-bd6f-50f54ecdf4c9-kube-api-access-2kdcw\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327602 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwl5\" (UniqueName: \"kubernetes.io/projected/9a6c6613-63d4-485e-ba56-f2347d72872e-kube-api-access-scwl5\") pod \"dns-operator-744455d44c-qssvq\" (UID: \"9a6c6613-63d4-485e-ba56-f2347d72872e\") " pod="openshift-dns-operator/dns-operator-744455d44c-qssvq"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327621 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffa6af37-afb0-4226-be8c-83399f30793a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mtgwk\" (UID: \"ffa6af37-afb0-4226-be8c-83399f30793a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327638 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgjcm\" (UniqueName: \"kubernetes.io/projected/7be202b3-8f35-4fe4-84ab-aea26389b7fd-kube-api-access-tgjcm\") pod \"migrator-59844c95c7-wnl59\" (UID: \"7be202b3-8f35-4fe4-84ab-aea26389b7fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327654 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-stats-auth\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327670 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmg8v\" (UniqueName: \"kubernetes.io/projected/62769b11-1e27-44f2-836c-da79ac2655b6-kube-api-access-zmg8v\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327686 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16ad833e-f3a4-4af3-97b7-d960f7905292-apiservice-cert\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327710 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/812d4016-c6ce-4440-bd68-3328eb8b9421-proxy-tls\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327725 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6007d943-10d1-4737-9821-874c5fe1a043-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327751 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgpm\" (UniqueName: \"kubernetes.io/projected/df83a900-367b-4c33-8a5f-ac822552edde-kube-api-access-5sgpm\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.327786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a6c6613-63d4-485e-ba56-f2347d72872e-metrics-tls\") pod \"dns-operator-744455d44c-qssvq\" (UID: \"9a6c6613-63d4-485e-ba56-f2347d72872e\") " pod="openshift-dns-operator/dns-operator-744455d44c-qssvq"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328397 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef80bed0-4ac7-4df7-87e9-72eb4299008a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328427 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9e38323-ab31-433d-a8c2-6f1814529fea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328444 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgs7\" (UniqueName: \"kubernetes.io/projected/c9e38323-ab31-433d-a8c2-6f1814529fea-kube-api-access-lxgs7\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328471 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwf8\" (UniqueName: \"kubernetes.io/projected/2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11-kube-api-access-9kwf8\") pod \"downloads-7954f5f757-9pdrq\" (UID: \"2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11\") " pod="openshift-console/downloads-7954f5f757-9pdrq"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328499 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-tls\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328544 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w64\" (UniqueName: \"kubernetes.io/projected/73bc6ad6-8024-4d5b-a0b6-995e29f987af-kube-api-access-l8w64\") pod \"control-plane-machine-set-operator-78cbb6b69f-fb6d2\" (UID: \"73bc6ad6-8024-4d5b-a0b6-995e29f987af\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328568 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-trusted-ca\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328685 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9e38323-ab31-433d-a8c2-6f1814529fea-proxy-tls\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328713 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlx64\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-kube-api-access-nlx64\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328728 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19ba1f1c-943a-416f-94cf-0f16d9908a88-srv-cert\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.328780 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r5lg\" (UniqueName: \"kubernetes.io/projected/6007d943-10d1-4737-9821-874c5fe1a043-kube-api-access-8r5lg\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"
Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.330616 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:00.830598516 +0000 UTC m=+118.364148772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.345794 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jlq7q"]
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.350365 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8xklm"]
Oct 13 13:09:00 crc kubenswrapper[4797]: W1013 13:09:00.378296 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72e2007_fbd4_4c7a_a0fc_9c949a748441.slice/crio-72744f9bd95604904e2a2099c4bab3a740bad454a60ca03375de631a423cd521 WatchSource:0}: Error finding container 72744f9bd95604904e2a2099c4bab3a740bad454a60ca03375de631a423cd521: Status 404 returned error can't find the container with id 72744f9bd95604904e2a2099c4bab3a740bad454a60ca03375de631a423cd521
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429567 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429717 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429742 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bb8a2965-1b45-4811-8da9-72e901118c75-signing-key\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429761 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-plugins-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429775 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463ec1ab-bb91-45f7-b090-7aeb71365797-cert\") pod \"ingress-canary-8cwgx\" (UID: \"463ec1ab-bb91-45f7-b090-7aeb71365797\") " pod="openshift-ingress-canary/ingress-canary-8cwgx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429813 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tmlh5\" (UID: \"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429833 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nx98\" (UniqueName: \"kubernetes.io/projected/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-kube-api-access-4nx98\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429850 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh7wp\" (UniqueName: \"kubernetes.io/projected/bb8a2965-1b45-4811-8da9-72e901118c75-kube-api-access-lh7wp\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429868 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc2e514b-61f8-47b1-975e-4a910550ecaa-secret-volume\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429891 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/73bc6ad6-8024-4d5b-a0b6-995e29f987af-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fb6d2\" (UID: \"73bc6ad6-8024-4d5b-a0b6-995e29f987af\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429911 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/812d4016-c6ce-4440-bd68-3328eb8b9421-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429925 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh57k\" (UniqueName: \"kubernetes.io/projected/812d4016-c6ce-4440-bd68-3328eb8b9421-kube-api-access-vh57k\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429958 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4kt\" (UniqueName: \"kubernetes.io/projected/ef80bed0-4ac7-4df7-87e9-72eb4299008a-kube-api-access-8c4kt\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429975 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/463ec1ab-bb91-45f7-b090-7aeb71365797-kube-api-access-k6pq9\") pod \"ingress-canary-8cwgx\" (UID: \"463ec1ab-bb91-45f7-b090-7aeb71365797\") " pod="openshift-ingress-canary/ingress-canary-8cwgx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.429989 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-bound-sa-token\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430003 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62769b11-1e27-44f2-836c-da79ac2655b6-service-ca-bundle\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430018 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83a79b04-2fae-4444-90fd-a165aab2f901-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430048 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6007d943-10d1-4737-9821-874c5fe1a043-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430064 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-config-volume\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430097 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ef80bed0-4ac7-4df7-87e9-72eb4299008a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430113 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-serving-cert\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430129 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a79b04-2fae-4444-90fd-a165aab2f901-config\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430144 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-mountpoint-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430161 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430175 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmh5\" (UniqueName: \"kubernetes.io/projected/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-kube-api-access-zdmh5\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430190 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8l5c\" (UniqueName: \"kubernetes.io/projected/85522ce2-6359-4ee0-bb51-7c19dffbfd09-kube-api-access-z8l5c\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430209 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx28m\" (UniqueName: \"kubernetes.io/projected/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-kube-api-access-vx28m\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430223 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kdcw\" (UniqueName: \"kubernetes.io/projected/6939a36c-2200-4151-bd6f-50f54ecdf4c9-kube-api-access-2kdcw\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430238 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scwl5\" (UniqueName: \"kubernetes.io/projected/9a6c6613-63d4-485e-ba56-f2347d72872e-kube-api-access-scwl5\") pod \"dns-operator-744455d44c-qssvq\" (UID: \"9a6c6613-63d4-485e-ba56-f2347d72872e\") " pod="openshift-dns-operator/dns-operator-744455d44c-qssvq"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430253 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6939a36c-2200-4151-bd6f-50f54ecdf4c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430268 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffa6af37-afb0-4226-be8c-83399f30793a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mtgwk\" (UID: \"ffa6af37-afb0-4226-be8c-83399f30793a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wlhw\" (UniqueName: \"kubernetes.io/projected/b499af0a-7024-43c4-9dad-53b9c27e3944-kube-api-access-9wlhw\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430299 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16ad833e-f3a4-4af3-97b7-d960f7905292-apiservice-cert\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430314 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgjcm\" (UniqueName: \"kubernetes.io/projected/7be202b3-8f35-4fe4-84ab-aea26389b7fd-kube-api-access-tgjcm\") pod \"migrator-59844c95c7-wnl59\" (UID: \"7be202b3-8f35-4fe4-84ab-aea26389b7fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430330 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-stats-auth\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430343 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmg8v\" (UniqueName: \"kubernetes.io/projected/62769b11-1e27-44f2-836c-da79ac2655b6-kube-api-access-zmg8v\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/812d4016-c6ce-4440-bd68-3328eb8b9421-proxy-tls\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430371 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6007d943-10d1-4737-9821-874c5fe1a043-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430386 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgpm\" (UniqueName: \"kubernetes.io/projected/df83a900-367b-4c33-8a5f-ac822552edde-kube-api-access-5sgpm\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430401 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a6c6613-63d4-485e-ba56-f2347d72872e-metrics-tls\") pod \"dns-operator-744455d44c-qssvq\" (UID: \"9a6c6613-63d4-485e-ba56-f2347d72872e\") " pod="openshift-dns-operator/dns-operator-744455d44c-qssvq"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430416 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef80bed0-4ac7-4df7-87e9-72eb4299008a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430431 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9e38323-ab31-433d-a8c2-6f1814529fea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430447 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgs7\" (UniqueName: \"kubernetes.io/projected/c9e38323-ab31-433d-a8c2-6f1814529fea-kube-api-access-lxgs7\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwf8\" (UniqueName: \"kubernetes.io/projected/2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11-kube-api-access-9kwf8\") pod \"downloads-7954f5f757-9pdrq\" (UID: \"2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11\") " pod="openshift-console/downloads-7954f5f757-9pdrq"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhgvn\" (UniqueName: \"kubernetes.io/projected/41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8-kube-api-access-dhgvn\") pod \"package-server-manager-789f6589d5-tmlh5\" (UID: \"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430493 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-tls\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430515 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8w64\" (UniqueName: \"kubernetes.io/projected/73bc6ad6-8024-4d5b-a0b6-995e29f987af-kube-api-access-l8w64\") pod \"control-plane-machine-set-operator-78cbb6b69f-fb6d2\" (UID: \"73bc6ad6-8024-4d5b-a0b6-995e29f987af\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430530 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-trusted-ca\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9e38323-ab31-433d-a8c2-6f1814529fea-proxy-tls\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430563 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-csi-data-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430580 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlx64\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-kube-api-access-nlx64\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19ba1f1c-943a-416f-94cf-0f16d9908a88-srv-cert\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430612 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r5lg\" (UniqueName: \"kubernetes.io/projected/6007d943-10d1-4737-9821-874c5fe1a043-kube-api-access-8r5lg\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430634 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6007d943-10d1-4737-9821-874c5fe1a043-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430650 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/812d4016-c6ce-4440-bd68-3328eb8b9421-images\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430665 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sppvk\" (UniqueName: \"kubernetes.io/projected/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-kube-api-access-sppvk\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430687 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df83a900-367b-4c33-8a5f-ac822552edde-profile-collector-cert\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430702 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430718 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc2e514b-61f8-47b1-975e-4a910550ecaa-config-volume\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"
Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430735 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdmf\" 
(UniqueName: \"kubernetes.io/projected/19ba1f1c-943a-416f-94cf-0f16d9908a88-kube-api-access-lsdmf\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bb8a2965-1b45-4811-8da9-72e901118c75-signing-cabundle\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430765 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b499af0a-7024-43c4-9dad-53b9c27e3944-node-bootstrap-token\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430787 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-default-certificate\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5w4d\" (UniqueName: \"kubernetes.io/projected/ffa6af37-afb0-4226-be8c-83399f30793a-kube-api-access-v5w4d\") pod \"multus-admission-controller-857f4d67dd-mtgwk\" (UID: \"ffa6af37-afb0-4226-be8c-83399f30793a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 
13:09:00.430838 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzclr\" (UniqueName: \"kubernetes.io/projected/fc2e514b-61f8-47b1-975e-4a910550ecaa-kube-api-access-kzclr\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430853 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b499af0a-7024-43c4-9dad-53b9c27e3944-certs\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430869 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-certificates\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430885 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16ad833e-f3a4-4af3-97b7-d960f7905292-webhook-cert\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430899 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83a79b04-2fae-4444-90fd-a165aab2f901-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430915 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-metrics-certs\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430930 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-metrics-tls\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-socket-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430962 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df83a900-367b-4c33-8a5f-ac822552edde-srv-cert\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430979 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: 
\"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.430994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6939a36c-2200-4151-bd6f-50f54ecdf4c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431049 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16ad833e-f3a4-4af3-97b7-d960f7905292-tmpfs\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431064 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/19ba1f1c-943a-416f-94cf-0f16d9908a88-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431087 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431101 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-config\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431130 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6939a36c-2200-4151-bd6f-50f54ecdf4c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431144 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-config\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431158 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431173 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xg9b\" (UniqueName: \"kubernetes.io/projected/16ad833e-f3a4-4af3-97b7-d960f7905292-kube-api-access-8xg9b\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.431188 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-registration-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.431457 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:00.93144352 +0000 UTC m=+118.464993776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.434252 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16ad833e-f3a4-4af3-97b7-d960f7905292-tmpfs\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.434453 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a79b04-2fae-4444-90fd-a165aab2f901-config\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.435466 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.438406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/bb8a2965-1b45-4811-8da9-72e901118c75-signing-cabundle\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.438565 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc2e514b-61f8-47b1-975e-4a910550ecaa-config-volume\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.439984 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.441011 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62769b11-1e27-44f2-836c-da79ac2655b6-service-ca-bundle\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.442050 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-trusted-ca\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.442090 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.442209 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.443307 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6939a36c-2200-4151-bd6f-50f54ecdf4c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.443520 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-certificates\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.444002 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ef80bed0-4ac7-4df7-87e9-72eb4299008a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.444363 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-config\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.453819 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df83a900-367b-4c33-8a5f-ac822552edde-profile-collector-cert\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.454332 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bb8a2965-1b45-4811-8da9-72e901118c75-signing-key\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.445566 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16ad833e-f3a4-4af3-97b7-d960f7905292-webhook-cert\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.455550 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/812d4016-c6ce-4440-bd68-3328eb8b9421-proxy-tls\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.455589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.455943 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83a79b04-2fae-4444-90fd-a165aab2f901-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.456100 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6939a36c-2200-4151-bd6f-50f54ecdf4c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.456463 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.456713 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a6c6613-63d4-485e-ba56-f2347d72872e-metrics-tls\") pod \"dns-operator-744455d44c-qssvq\" (UID: \"9a6c6613-63d4-485e-ba56-f2347d72872e\") " pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 
13:09:00.457227 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df83a900-367b-4c33-8a5f-ac822552edde-srv-cert\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.457434 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9e38323-ab31-433d-a8c2-6f1814529fea-proxy-tls\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.457841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19ba1f1c-943a-416f-94cf-0f16d9908a88-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.459376 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-stats-auth\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.459629 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/812d4016-c6ce-4440-bd68-3328eb8b9421-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.459724 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/73bc6ad6-8024-4d5b-a0b6-995e29f987af-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fb6d2\" (UID: \"73bc6ad6-8024-4d5b-a0b6-995e29f987af\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.459746 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6007d943-10d1-4737-9821-874c5fe1a043-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.460181 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6007d943-10d1-4737-9821-874c5fe1a043-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.460859 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16ad833e-f3a4-4af3-97b7-d960f7905292-apiservice-cert\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.461307 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffa6af37-afb0-4226-be8c-83399f30793a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mtgwk\" (UID: \"ffa6af37-afb0-4226-be8c-83399f30793a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.461478 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: \"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.461735 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19ba1f1c-943a-416f-94cf-0f16d9908a88-srv-cert\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.461897 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9e38323-ab31-433d-a8c2-6f1814529fea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.461917 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc2e514b-61f8-47b1-975e-4a910550ecaa-secret-volume\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc 
kubenswrapper[4797]: I1013 13:09:00.462253 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-serving-cert\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.462761 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.462839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-config\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.462848 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-tls\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.463527 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-default-certificate\") pod \"router-default-5444994796-ngmd7\" (UID: 
\"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.466194 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-72vb4"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.470368 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/812d4016-c6ce-4440-bd68-3328eb8b9421-images\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.472793 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.476393 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62769b11-1e27-44f2-836c-da79ac2655b6-metrics-certs\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.478436 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq"] Oct 13 13:09:00 crc kubenswrapper[4797]: W1013 13:09:00.478450 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f4ebe7_de91_4b1c_b157_4234d535e206.slice/crio-5d41e58eba877db15419a821ddad19726dd1bd34c39a2319fd904caa569e6528 WatchSource:0}: Error finding container 5d41e58eba877db15419a821ddad19726dd1bd34c39a2319fd904caa569e6528: Status 404 returned error can't find the container with id 5d41e58eba877db15419a821ddad19726dd1bd34c39a2319fd904caa569e6528 Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.491517 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwf8\" (UniqueName: \"kubernetes.io/projected/2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11-kube-api-access-9kwf8\") pod \"downloads-7954f5f757-9pdrq\" (UID: \"2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11\") " pod="openshift-console/downloads-7954f5f757-9pdrq" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.494266 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.498246 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxrg4"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.502195 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.504986 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nx98\" (UniqueName: \"kubernetes.io/projected/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-kube-api-access-4nx98\") pod \"marketplace-operator-79b997595-hxck2\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.506039 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.507649 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef80bed0-4ac7-4df7-87e9-72eb4299008a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.508118 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9pdrq" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.515402 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fr6cp"] Oct 13 13:09:00 crc kubenswrapper[4797]: W1013 13:09:00.520286 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d6de0ee_416d_43d1_bf5a_e176bc41b2c5.slice/crio-3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b WatchSource:0}: Error finding container 3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b: Status 404 returned error can't find the container with id 3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.525862 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh7wp\" (UniqueName: \"kubernetes.io/projected/bb8a2965-1b45-4811-8da9-72e901118c75-kube-api-access-lh7wp\") pod \"service-ca-9c57cc56f-7s6jm\" (UID: \"bb8a2965-1b45-4811-8da9-72e901118c75\") " pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc 
kubenswrapper[4797]: W1013 13:09:00.533231 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24fe086_ff48_4538_ad8a_3bb681cf9116.slice/crio-64981e9f5724b29b016677657f309601f94680f652c39689579800e0d0db58d8 WatchSource:0}: Error finding container 64981e9f5724b29b016677657f309601f94680f652c39689579800e0d0db58d8: Status 404 returned error can't find the container with id 64981e9f5724b29b016677657f309601f94680f652c39689579800e0d0db58d8 Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533236 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhgvn\" (UniqueName: \"kubernetes.io/projected/41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8-kube-api-access-dhgvn\") pod \"package-server-manager-789f6589d5-tmlh5\" (UID: \"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533324 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-csi-data-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533404 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b499af0a-7024-43c4-9dad-53b9c27e3944-node-bootstrap-token\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533436 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/b499af0a-7024-43c4-9dad-53b9c27e3944-certs\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533460 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-metrics-tls\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533476 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-socket-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533491 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-csi-data-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533515 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-registration-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533540 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-plugins-dir\") pod 
\"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533553 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463ec1ab-bb91-45f7-b090-7aeb71365797-cert\") pod \"ingress-canary-8cwgx\" (UID: \"463ec1ab-bb91-45f7-b090-7aeb71365797\") " pod="openshift-ingress-canary/ingress-canary-8cwgx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533571 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tmlh5\" (UID: \"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533630 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/463ec1ab-bb91-45f7-b090-7aeb71365797-kube-api-access-k6pq9\") pod \"ingress-canary-8cwgx\" (UID: \"463ec1ab-bb91-45f7-b090-7aeb71365797\") " pod="openshift-ingress-canary/ingress-canary-8cwgx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533656 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-config-volume\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533677 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533700 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-mountpoint-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533725 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8l5c\" (UniqueName: \"kubernetes.io/projected/85522ce2-6359-4ee0-bb51-7c19dffbfd09-kube-api-access-z8l5c\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx28m\" (UniqueName: \"kubernetes.io/projected/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-kube-api-access-vx28m\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.533784 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wlhw\" (UniqueName: \"kubernetes.io/projected/b499af0a-7024-43c4-9dad-53b9c27e3944-kube-api-access-9wlhw\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.534099 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-socket-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.534143 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-registration-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.534171 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-plugins-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.534973 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.0349617 +0000 UTC m=+118.568511956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.537051 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/85522ce2-6359-4ee0-bb51-7c19dffbfd09-mountpoint-dir\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.538184 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b499af0a-7024-43c4-9dad-53b9c27e3944-certs\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.538406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b499af0a-7024-43c4-9dad-53b9c27e3944-node-bootstrap-token\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.539819 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-config-volume\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 
13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.540301 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tmlh5\" (UID: \"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.542956 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8w64\" (UniqueName: \"kubernetes.io/projected/73bc6ad6-8024-4d5b-a0b6-995e29f987af-kube-api-access-l8w64\") pod \"control-plane-machine-set-operator-78cbb6b69f-fb6d2\" (UID: \"73bc6ad6-8024-4d5b-a0b6-995e29f987af\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.544246 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-metrics-tls\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.544561 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463ec1ab-bb91-45f7-b090-7aeb71365797-cert\") pod \"ingress-canary-8cwgx\" (UID: \"463ec1ab-bb91-45f7-b090-7aeb71365797\") " pod="openshift-ingress-canary/ingress-canary-8cwgx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.563839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83a79b04-2fae-4444-90fd-a165aab2f901-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tc8mj\" (UID: \"83a79b04-2fae-4444-90fd-a165aab2f901\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.583937 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmh5\" (UniqueName: \"kubernetes.io/projected/a2bc926f-f4d3-4811-ba2f-c7f520b910bb-kube-api-access-zdmh5\") pod \"kube-storage-version-migrator-operator-b67b599dd-9xnlf\" (UID: \"a2bc926f-f4d3-4811-ba2f-c7f520b910bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.608639 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kdcw\" (UniqueName: \"kubernetes.io/projected/6939a36c-2200-4151-bd6f-50f54ecdf4c9-kube-api-access-2kdcw\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.617171 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vw685"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.626513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwl5\" (UniqueName: \"kubernetes.io/projected/9a6c6613-63d4-485e-ba56-f2347d72872e-kube-api-access-scwl5\") pod \"dns-operator-744455d44c-qssvq\" (UID: \"9a6c6613-63d4-485e-ba56-f2347d72872e\") " pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.627982 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx7"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.629965 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.635117 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.635548 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.135532058 +0000 UTC m=+118.669082314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:00 crc kubenswrapper[4797]: W1013 13:09:00.638142 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9a3c57_03b2_4adc_82a1_3aba68c83636.slice/crio-4e440305c7449a5cda87c2a4a8e3237f54b5e37eaa39e97552ff1947367b5086 WatchSource:0}: Error finding container 4e440305c7449a5cda87c2a4a8e3237f54b5e37eaa39e97552ff1947367b5086: Status 404 returned error can't find the container with id 4e440305c7449a5cda87c2a4a8e3237f54b5e37eaa39e97552ff1947367b5086 Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.639883 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6939a36c-2200-4151-bd6f-50f54ecdf4c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lkws7\" (UID: \"6939a36c-2200-4151-bd6f-50f54ecdf4c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.641698 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.669728 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdmf\" (UniqueName: \"kubernetes.io/projected/19ba1f1c-943a-416f-94cf-0f16d9908a88-kube-api-access-lsdmf\") pod \"olm-operator-6b444d44fb-7j4bs\" (UID: \"19ba1f1c-943a-416f-94cf-0f16d9908a88\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.675584 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.681838 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5w4d\" (UniqueName: \"kubernetes.io/projected/ffa6af37-afb0-4226-be8c-83399f30793a-kube-api-access-v5w4d\") pod \"multus-admission-controller-857f4d67dd-mtgwk\" (UID: \"ffa6af37-afb0-4226-be8c-83399f30793a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.686497 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.703864 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9pdrq"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.715796 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.719399 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzclr\" (UniqueName: \"kubernetes.io/projected/fc2e514b-61f8-47b1-975e-4a910550ecaa-kube-api-access-kzclr\") pod \"collect-profiles-29339340-k2d9s\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.722199 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r5lg\" (UniqueName: \"kubernetes.io/projected/6007d943-10d1-4737-9821-874c5fe1a043-kube-api-access-8r5lg\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.736756 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.737142 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.23713103 +0000 UTC m=+118.770681286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:00 crc kubenswrapper[4797]: W1013 13:09:00.737950 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8b627e_6ee4_4ba8_b83f_fc84cf2b2c11.slice/crio-5ecb829d72563fcb5d58479484eff6a5124084f7e9967e3adbece0916ba891d9 WatchSource:0}: Error finding container 5ecb829d72563fcb5d58479484eff6a5124084f7e9967e3adbece0916ba891d9: Status 404 returned error can't find the container with id 5ecb829d72563fcb5d58479484eff6a5124084f7e9967e3adbece0916ba891d9 Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.740290 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-bound-sa-token\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.759339 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb019f89-a9a5-4589-9116-f9ee8e3ffb3c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5pjbz\" (UID: 
\"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.782080 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4kt\" (UniqueName: \"kubernetes.io/projected/ef80bed0-4ac7-4df7-87e9-72eb4299008a-kube-api-access-8c4kt\") pod \"openshift-config-operator-7777fb866f-gbkvx\" (UID: \"ef80bed0-4ac7-4df7-87e9-72eb4299008a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.785281 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.792518 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.802713 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh57k\" (UniqueName: \"kubernetes.io/projected/812d4016-c6ce-4440-bd68-3328eb8b9421-kube-api-access-vh57k\") pod \"machine-config-operator-74547568cd-7rxbv\" (UID: \"812d4016-c6ce-4440-bd68-3328eb8b9421\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.820774 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.830042 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlx64\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-kube-api-access-nlx64\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.843765 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.844177 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.344163217 +0000 UTC m=+118.877713473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.859769 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xg9b\" (UniqueName: \"kubernetes.io/projected/16ad833e-f3a4-4af3-97b7-d960f7905292-kube-api-access-8xg9b\") pod \"packageserver-d55dfcdfc-kv9dz\" (UID: \"16ad833e-f3a4-4af3-97b7-d960f7905292\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.882988 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgjcm\" (UniqueName: \"kubernetes.io/projected/7be202b3-8f35-4fe4-84ab-aea26389b7fd-kube-api-access-tgjcm\") pod \"migrator-59844c95c7-wnl59\" (UID: \"7be202b3-8f35-4fe4-84ab-aea26389b7fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.883218 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.884315 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmg8v\" (UniqueName: \"kubernetes.io/projected/62769b11-1e27-44f2-836c-da79ac2655b6-kube-api-access-zmg8v\") pod \"router-default-5444994796-ngmd7\" (UID: \"62769b11-1e27-44f2-836c-da79ac2655b6\") " pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.894440 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.902184 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2n27q\" (UID: \"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.904477 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2"] Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.904533 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.919290 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6007d943-10d1-4737-9821-874c5fe1a043-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mgjqp\" (UID: \"6007d943-10d1-4737-9821-874c5fe1a043\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.921393 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.941031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" event={"ID":"d637970b-bb85-4dd8-beb8-1f01479781d1","Type":"ContainerStarted","Data":"ef880a4e3024f5bcaae02b7f58faf90762c75433ebaeb97bcd69be2f14d40328"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.941068 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" event={"ID":"d637970b-bb85-4dd8-beb8-1f01479781d1","Type":"ContainerStarted","Data":"4d21461ceca7b701f288b5c20d8c7a7c337f7acd160a93b24f324833dacaa495"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.941077 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" event={"ID":"d637970b-bb85-4dd8-beb8-1f01479781d1","Type":"ContainerStarted","Data":"a28dc597b3073508c89d839ed52f05afeaa9498434ecbb9dea2ec32fd376a7db"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.942340 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" 
event={"ID":"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5","Type":"ContainerStarted","Data":"c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.942379 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" event={"ID":"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5","Type":"ContainerStarted","Data":"3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.942856 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.945796 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:00 crc kubenswrapper[4797]: E1013 13:09:00.946268 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.446255312 +0000 UTC m=+118.979805568 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.948226 4797 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wxrg4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.948297 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" podUID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.948436 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgpm\" (UniqueName: \"kubernetes.io/projected/df83a900-367b-4c33-8a5f-ac822552edde-kube-api-access-5sgpm\") pod \"catalog-operator-68c6474976-lwt47\" (UID: \"df83a900-367b-4c33-8a5f-ac822552edde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.950053 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" event={"ID":"32de0c29-41b8-4e79-97c0-1d72acb48feb","Type":"ContainerStarted","Data":"56e915f2027a5e3ecc83939fd35dc80bda70b6eebb7551adc8002803579f2a9d"} Oct 13 13:09:00 crc 
kubenswrapper[4797]: I1013 13:09:00.950084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" event={"ID":"32de0c29-41b8-4e79-97c0-1d72acb48feb","Type":"ContainerStarted","Data":"dd22547f6b05e8f3d2d46d378524dc454cab983163a8ec703c6adcfd1cf889d1"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.950187 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.950921 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.954432 4797 patch_prober.go:28] interesting pod/console-operator-58897d9998-jlq7q container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.954486 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" podUID="32de0c29-41b8-4e79-97c0-1d72acb48feb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.959170 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" event={"ID":"4f565495-cb16-4443-8018-24e277acac69","Type":"ContainerStarted","Data":"b166f5ffe228add9c386a0e7095a55e709d7d62522bf9b1a83f250a1ec2ac2db"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.959744 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.968448 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" event={"ID":"df26da53-3ff8-4402-b5f6-25166b6b0f8a","Type":"ContainerStarted","Data":"58ff5b9cd673a48fbec243e0f692229eaddb0933325017c66068cbfe7386d815"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.968488 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" event={"ID":"df26da53-3ff8-4402-b5f6-25166b6b0f8a","Type":"ContainerStarted","Data":"f5040ba7f295c500bb34960aed6b8edc283a642f2098db16bf3156967f0edbc7"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.970495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sppvk\" (UniqueName: \"kubernetes.io/projected/4e99badd-da38-4d6d-bf30-8c5d837e4ca9-kube-api-access-sppvk\") pod \"service-ca-operator-777779d784-8k7h2\" (UID: \"4e99badd-da38-4d6d-bf30-8c5d837e4ca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.981187 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxgs7\" (UniqueName: \"kubernetes.io/projected/c9e38323-ab31-433d-a8c2-6f1814529fea-kube-api-access-lxgs7\") pod \"machine-config-controller-84d6567774-jsw4c\" (UID: \"c9e38323-ab31-433d-a8c2-6f1814529fea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.984119 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8xklm" 
event={"ID":"c72e2007-fbd4-4c7a-a0fc-9c949a748441","Type":"ContainerStarted","Data":"c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.984164 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8xklm" event={"ID":"c72e2007-fbd4-4c7a-a0fc-9c949a748441","Type":"ContainerStarted","Data":"72744f9bd95604904e2a2099c4bab3a740bad454a60ca03375de631a423cd521"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.987609 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" event={"ID":"d24fe086-ff48-4538-ad8a-3bb681cf9116","Type":"ContainerStarted","Data":"64981e9f5724b29b016677657f309601f94680f652c39689579800e0d0db58d8"} Oct 13 13:09:00 crc kubenswrapper[4797]: W1013 13:09:00.988482 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73bc6ad6_8024_4d5b_a0b6_995e29f987af.slice/crio-9aa56944fb8655b9f137acb821ecd41ff7a33e30f8f25188ee6d20cf7cb8a701 WatchSource:0}: Error finding container 9aa56944fb8655b9f137acb821ecd41ff7a33e30f8f25188ee6d20cf7cb8a701: Status 404 returned error can't find the container with id 9aa56944fb8655b9f137acb821ecd41ff7a33e30f8f25188ee6d20cf7cb8a701 Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.988732 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9pdrq" event={"ID":"2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11","Type":"ContainerStarted","Data":"5ecb829d72563fcb5d58479484eff6a5124084f7e9967e3adbece0916ba891d9"} Oct 13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.990230 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" event={"ID":"3a9a3c57-03b2-4adc-82a1-3aba68c83636","Type":"ContainerStarted","Data":"4e440305c7449a5cda87c2a4a8e3237f54b5e37eaa39e97552ff1947367b5086"} Oct 
13 13:09:00 crc kubenswrapper[4797]: I1013 13:09:00.993725 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.000946 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" event={"ID":"f156a9cc-9d97-4364-a17d-f02d8b0f8abe","Type":"ContainerStarted","Data":"04aeba2bb3e4229a62d87ef5153cb824733d145852432f519fee77509d8c766a"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.001008 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.005024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" event={"ID":"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50","Type":"ContainerStarted","Data":"3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.005078 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" event={"ID":"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50","Type":"ContainerStarted","Data":"56ce8624dbdeb13b333f32d25598b678031ef627e8efa4d8a1d84e6e1feee495"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.005198 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.016743 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" 
event={"ID":"c528ef7f-ab03-4c1a-96b7-6d88270b67ee","Type":"ContainerStarted","Data":"b85e77f158cc47d8b989bfc5d08330368e7ee03ca7eb1174e66a22b4c7484fe5"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.016789 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" event={"ID":"c528ef7f-ab03-4c1a-96b7-6d88270b67ee","Type":"ContainerStarted","Data":"a4ac464bab8b38c3f36aa31ea86f8eca93e7efc177e5f4ca432593a82fa43410"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.024445 4797 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lvjgk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.024493 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" podUID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.029467 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhgvn\" (UniqueName: \"kubernetes.io/projected/41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8-kube-api-access-dhgvn\") pod \"package-server-manager-789f6589d5-tmlh5\" (UID: \"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.030262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" 
event={"ID":"33f4ebe7-de91-4b1c-b157-4234d535e206","Type":"ContainerStarted","Data":"232881b559951fcbf21979d01eaf46d8fb01e51b0b3b1c471d57630d593af0d6"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.030298 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" event={"ID":"33f4ebe7-de91-4b1c-b157-4234d535e206","Type":"ContainerStarted","Data":"5d41e58eba877db15419a821ddad19726dd1bd34c39a2319fd904caa569e6528"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.037787 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.043050 4797 generic.go:334] "Generic (PLEG): container finished" podID="90473429-30ec-490e-a96f-d66fce3c994c" containerID="a86e22b10f5471457292cde786831d73171cd5118386712c234a35867cf2cce6" exitCode=0 Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.043174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" event={"ID":"90473429-30ec-490e-a96f-d66fce3c994c","Type":"ContainerDied","Data":"a86e22b10f5471457292cde786831d73171cd5118386712c234a35867cf2cce6"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.043201 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" event={"ID":"90473429-30ec-490e-a96f-d66fce3c994c","Type":"ContainerStarted","Data":"98ff539155a56d11bbf76d4d3c8b900f39a51715f301fbe1cc87a9bf6cc4f811"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.045178 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" event={"ID":"5dae836b-33d9-45ed-9b21-13311ceff098","Type":"ContainerStarted","Data":"6bcad77503840a8e54730444fe8c137848f04ce32b06e6bc3f139e9ae6182bcc"} Oct 13 
13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.045202 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" event={"ID":"5dae836b-33d9-45ed-9b21-13311ceff098","Type":"ContainerStarted","Data":"482b2be8246ff58451d22fc781bfa06f2e7285886ec95d4a255c2042384d8d10"} Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.046877 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.047938 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.547921357 +0000 UTC m=+119.081471613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.056153 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wlhw\" (UniqueName: \"kubernetes.io/projected/b499af0a-7024-43c4-9dad-53b9c27e3944-kube-api-access-9wlhw\") pod \"machine-config-server-bvpb8\" (UID: \"b499af0a-7024-43c4-9dad-53b9c27e3944\") " pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.057980 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bvpb8" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.065603 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/463ec1ab-bb91-45f7-b090-7aeb71365797-kube-api-access-k6pq9\") pod \"ingress-canary-8cwgx\" (UID: \"463ec1ab-bb91-45f7-b090-7aeb71365797\") " pod="openshift-ingress-canary/ingress-canary-8cwgx" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.075940 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.081493 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx28m\" (UniqueName: \"kubernetes.io/projected/75093995-2ea3-48d0-bbe7-aa933ebb1dc5-kube-api-access-vx28m\") pod \"dns-default-29smx\" (UID: \"75093995-2ea3-48d0-bbe7-aa933ebb1dc5\") " pod="openshift-dns/dns-default-29smx" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.109611 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8l5c\" (UniqueName: \"kubernetes.io/projected/85522ce2-6359-4ee0-bb51-7c19dffbfd09-kube-api-access-z8l5c\") pod \"csi-hostpathplugin-89wql\" (UID: \"85522ce2-6359-4ee0-bb51-7c19dffbfd09\") " pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.130824 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.149026 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.152278 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.652263608 +0000 UTC m=+119.185813864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.164909 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.180677 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.192539 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7s6jm"] Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.226013 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf"] Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.250729 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.251089 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:01.751074641 +0000 UTC m=+119.284624897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.268588 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.270380 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxck2"] Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.329179 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-89wql" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.345288 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8cwgx" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.352409 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-29smx" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.353220 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.353509 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.853498385 +0000 UTC m=+119.387048641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.454594 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.454743 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.954727218 +0000 UTC m=+119.488277474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.455081 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.455353 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:01.955344263 +0000 UTC m=+119.488894519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.555723 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.556155 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.056139556 +0000 UTC m=+119.589689812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.657195 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.657561 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.157548284 +0000 UTC m=+119.691098540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.675148 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8xklm" podStartSLOduration=98.675131509 podStartE2EDuration="1m38.675131509s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:01.67313014 +0000 UTC m=+119.206680406" watchObservedRunningTime="2025-10-13 13:09:01.675131509 +0000 UTC m=+119.208681765" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.723686 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" podStartSLOduration=98.72367248 podStartE2EDuration="1m38.72367248s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:01.722349937 +0000 UTC m=+119.255900203" watchObservedRunningTime="2025-10-13 13:09:01.72367248 +0000 UTC m=+119.257222726" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.760635 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.760877 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.260845279 +0000 UTC m=+119.794395535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.763344 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.263328681 +0000 UTC m=+119.796878927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.766877 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.793185 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vfstq" podStartSLOduration=97.793166169 podStartE2EDuration="1m37.793166169s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:01.760111841 +0000 UTC m=+119.293662117" watchObservedRunningTime="2025-10-13 13:09:01.793166169 +0000 UTC m=+119.326716425" Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.871395 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.871905 4797 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.371870425 +0000 UTC m=+119.905420681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:01 crc kubenswrapper[4797]: I1013 13:09:01.975214 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:01 crc kubenswrapper[4797]: E1013 13:09:01.975651 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.475639792 +0000 UTC m=+120.009190048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.029477 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.049355 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mwhth" podStartSLOduration=99.049338484 podStartE2EDuration="1m39.049338484s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:02.048793721 +0000 UTC m=+119.582343987" watchObservedRunningTime="2025-10-13 13:09:02.049338484 +0000 UTC m=+119.582888740" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.064772 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" event={"ID":"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e","Type":"ContainerStarted","Data":"92daef95090852916fb77bf7c96e6d4f56cb5b2410062f430b59b417e127ca50"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.065093 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" event={"ID":"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e","Type":"ContainerStarted","Data":"eae8462120f3f150e29fd415d4be57658b1ca08ee6355835566fa41c17ae84d0"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.065304 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.076003 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.076434 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.576407374 +0000 UTC m=+120.109957630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.086446 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" event={"ID":"90473429-30ec-490e-a96f-d66fce3c994c","Type":"ContainerStarted","Data":"c278d44762b2350ba09a27b9862a5338405e88a73eba1159cd5e676d15f4aa7b"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.086482 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" 
event={"ID":"90473429-30ec-490e-a96f-d66fce3c994c","Type":"ContainerStarted","Data":"969eb15449e84b024f0f8842e86285ae9f185ae1d2abbcba149ff7477cf6491d"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.092968 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hxck2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.093034 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.097082 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9pdrq" event={"ID":"2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11","Type":"ContainerStarted","Data":"4aa13144efca8e2f7cda72136b4c91ce1c991f42f57fcc0a411ef34d8c1da84a"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.097441 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9pdrq" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.101575 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pdrq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.101616 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pdrq" podUID="2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.113278 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" event={"ID":"3a9a3c57-03b2-4adc-82a1-3aba68c83636","Type":"ContainerStarted","Data":"0b38468079ef9d8cba99483edc32dddd2ca8c287644781f3edf2b5ea826fc895"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.113318 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" event={"ID":"3a9a3c57-03b2-4adc-82a1-3aba68c83636","Type":"ContainerStarted","Data":"d7b73d66eb9fcb41a64120f82b4ef8de220733eb03988e04add793aeacafce44"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.128084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" event={"ID":"4f565495-cb16-4443-8018-24e277acac69","Type":"ContainerStarted","Data":"623b6fb9f987c0ab7aac9b59fab0d5579d482e7445779757e4d3325f2d1ea42f"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.129222 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.134064 4797 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5dpx7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.134115 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" podUID="4f565495-cb16-4443-8018-24e277acac69" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.138486 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.139874 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ngmd7" event={"ID":"62769b11-1e27-44f2-836c-da79ac2655b6","Type":"ContainerStarted","Data":"a9a861b7ef856e04528ba7ce5c194c85264bfd76aaf48648b9cec24815f62240"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.139900 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ngmd7" event={"ID":"62769b11-1e27-44f2-836c-da79ac2655b6","Type":"ContainerStarted","Data":"7477b382c783ef62bdbb8953b99d615cd6bafdd4147f8e9549085ea763479473"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.145854 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" event={"ID":"df26da53-3ff8-4402-b5f6-25166b6b0f8a","Type":"ContainerStarted","Data":"91c4e8b3de5eb071efe60511215a42c70e1a16cda246472d3e10159e2d872ee9"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.163606 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" podStartSLOduration=98.1635905 podStartE2EDuration="1m38.1635905s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:02.163079398 +0000 UTC m=+119.696629664" watchObservedRunningTime="2025-10-13 13:09:02.1635905 +0000 UTC m=+119.697140756" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.165606 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-server-bvpb8" event={"ID":"b499af0a-7024-43c4-9dad-53b9c27e3944","Type":"ContainerStarted","Data":"93b9c4fe84495094197e3f2de866ea08183a4c87e79840762592a915a4ecf114"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.165635 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bvpb8" event={"ID":"b499af0a-7024-43c4-9dad-53b9c27e3944","Type":"ContainerStarted","Data":"ef8947b2aba036b6f09774e329d8075d4f0c619f3674823d7b4ef2d6a7f8d5a2"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.181791 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" event={"ID":"a2bc926f-f4d3-4811-ba2f-c7f520b910bb","Type":"ContainerStarted","Data":"f80dc95ef7ef2f5ddff8cbe8d0e4e8e59ec0ebcbb0d7cdece803483bae626383"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.183496 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.187764 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.687750838 +0000 UTC m=+120.221301084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.208224 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" event={"ID":"73bc6ad6-8024-4d5b-a0b6-995e29f987af","Type":"ContainerStarted","Data":"289389d083cefbb20a670b066b21b0d893550e7e21fa3739895a7cdebdf85c79"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.208285 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" event={"ID":"73bc6ad6-8024-4d5b-a0b6-995e29f987af","Type":"ContainerStarted","Data":"9aa56944fb8655b9f137acb821ecd41ff7a33e30f8f25188ee6d20cf7cb8a701"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.212431 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" event={"ID":"d24fe086-ff48-4538-ad8a-3bb681cf9116","Type":"ContainerStarted","Data":"68525e9e83f4633d92a38a5796d79e67ec682a3404a14f8bc54e28065e8d7ebd"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.214693 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" event={"ID":"bb8a2965-1b45-4811-8da9-72e901118c75","Type":"ContainerStarted","Data":"7b1c6e0270cf0b0331c5717a289e5295a874038614ec5533f38c0ec3f3db3a83"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.214721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" 
event={"ID":"bb8a2965-1b45-4811-8da9-72e901118c75","Type":"ContainerStarted","Data":"baa6a5998612a209b79ee430f18cb035119f3cc3fb19291a7d588e827a091043"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.216308 4797 generic.go:334] "Generic (PLEG): container finished" podID="f156a9cc-9d97-4364-a17d-f02d8b0f8abe" containerID="3936148700cdb6addb616ca53797c66ecb026f77715076afe0562c33b6ab03fc" exitCode=0 Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.218718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" event={"ID":"f156a9cc-9d97-4364-a17d-f02d8b0f8abe","Type":"ContainerDied","Data":"3936148700cdb6addb616ca53797c66ecb026f77715076afe0562c33b6ab03fc"} Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.225057 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.247655 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.292596 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.294637 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.794615861 +0000 UTC m=+120.328166117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.351317 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" podStartSLOduration=98.351295433 podStartE2EDuration="1m38.351295433s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:02.337325147 +0000 UTC m=+119.870875413" watchObservedRunningTime="2025-10-13 13:09:02.351295433 +0000 UTC m=+119.884845689" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.394533 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.394929 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.894915581 +0000 UTC m=+120.428465837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.430489 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.481444 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mtgwk"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.482235 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qssvq"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.495081 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.495445 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:02.995431047 +0000 UTC m=+120.528981303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.509691 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.535366 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.547167 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.570150 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.571281 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.595466 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-89wql"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.595964 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.596230 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.09621976 +0000 UTC m=+120.629770016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: W1013 13:09:02.607101 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be202b3_8f35_4fe4_84ab_aea26389b7fd.slice/crio-6a7e6a689978aac2e8a8869da566b8a98717bd9924820ccd2884d074003df614 WatchSource:0}: Error finding container 6a7e6a689978aac2e8a8869da566b8a98717bd9924820ccd2884d074003df614: Status 404 returned error can't find the container with id 6a7e6a689978aac2e8a8869da566b8a98717bd9924820ccd2884d074003df614 Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.609269 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmwr8" podStartSLOduration=99.609252642 podStartE2EDuration="1m39.609252642s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:02.608987916 +0000 UTC m=+120.142538172" watchObservedRunningTime="2025-10-13 13:09:02.609252642 +0000 
UTC m=+120.142802888" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.618373 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.636348 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.638382 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.643367 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q"] Oct 13 13:09:02 crc kubenswrapper[4797]: W1013 13:09:02.671023 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2e514b_61f8_47b1_975e_4a910550ecaa.slice/crio-4cf3177b501a3a7ea0df2a551e8fe048530dbc05e02581bf2a68351f5bb8569e WatchSource:0}: Error finding container 4cf3177b501a3a7ea0df2a551e8fe048530dbc05e02581bf2a68351f5bb8569e: Status 404 returned error can't find the container with id 4cf3177b501a3a7ea0df2a551e8fe048530dbc05e02581bf2a68351f5bb8569e Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.673343 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.695958 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.695995 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-29smx"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.702302 
4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.702346 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8cwgx"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.702359 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c"] Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.704254 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.704834 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.204793224 +0000 UTC m=+120.738343480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: W1013 13:09:02.731962 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf83a900_367b_4c33_8a5f_ac822552edde.slice/crio-05da7d7c88900b76f0730a87b083b75b92ea947f869f83b24baf5db4103ae6f9 WatchSource:0}: Error finding container 05da7d7c88900b76f0730a87b083b75b92ea947f869f83b24baf5db4103ae6f9: Status 404 returned error can't find the container with id 05da7d7c88900b76f0730a87b083b75b92ea947f869f83b24baf5db4103ae6f9 Oct 13 13:09:02 crc kubenswrapper[4797]: W1013 13:09:02.744252 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb019f89_a9a5_4589_9116_f9ee8e3ffb3c.slice/crio-fda203d8a706efb2f74a3087828415c41d4d529b3ab5f1c6fcf6d844a29665f9 WatchSource:0}: Error finding container fda203d8a706efb2f74a3087828415c41d4d529b3ab5f1c6fcf6d844a29665f9: Status 404 returned error can't find the container with id fda203d8a706efb2f74a3087828415c41d4d529b3ab5f1c6fcf6d844a29665f9 Oct 13 13:09:02 crc kubenswrapper[4797]: W1013 13:09:02.761987 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb380ffb7_df80_4c6d_8c1a_1ed4a7d1208e.slice/crio-bec737b59ede9de28a373de81bf3f97f73570db396c765afed7e2810087ec1fb WatchSource:0}: Error finding container bec737b59ede9de28a373de81bf3f97f73570db396c765afed7e2810087ec1fb: Status 404 returned error can't find the container with 
id bec737b59ede9de28a373de81bf3f97f73570db396c765afed7e2810087ec1fb Oct 13 13:09:02 crc kubenswrapper[4797]: W1013 13:09:02.799679 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e38323_ab31_433d_a8c2_6f1814529fea.slice/crio-81ee424c31d97903cdf9bd8dca8d823cef13e83400f71ffdf277adb77a3ff735 WatchSource:0}: Error finding container 81ee424c31d97903cdf9bd8dca8d823cef13e83400f71ffdf277adb77a3ff735: Status 404 returned error can't find the container with id 81ee424c31d97903cdf9bd8dca8d823cef13e83400f71ffdf277adb77a3ff735 Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.811723 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.812078 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.312062477 +0000 UTC m=+120.845612733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.834307 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-l89kw" podStartSLOduration=99.834291367 podStartE2EDuration="1m39.834291367s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:02.783672805 +0000 UTC m=+120.317223061" watchObservedRunningTime="2025-10-13 13:09:02.834291367 +0000 UTC m=+120.367841623" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.906722 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vw685" podStartSLOduration=98.906706408 podStartE2EDuration="1m38.906706408s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:02.905223821 +0000 UTC m=+120.438774077" watchObservedRunningTime="2025-10-13 13:09:02.906706408 +0000 UTC m=+120.440256664" Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.934976 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.935189 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.435170892 +0000 UTC m=+120.968721148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:02 crc kubenswrapper[4797]: I1013 13:09:02.935224 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:02 crc kubenswrapper[4797]: E1013 13:09:02.935619 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.435604263 +0000 UTC m=+120.969154519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.047089 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.047500 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.54748512 +0000 UTC m=+121.081035376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.054824 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7s6jm" podStartSLOduration=99.054791241 podStartE2EDuration="1m39.054791241s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.054087693 +0000 UTC m=+120.587637949" watchObservedRunningTime="2025-10-13 13:09:03.054791241 +0000 UTC m=+120.588341497" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.055911 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" podStartSLOduration=99.055905098 podStartE2EDuration="1m39.055905098s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.018436762 +0000 UTC m=+120.551987028" watchObservedRunningTime="2025-10-13 13:09:03.055905098 +0000 UTC m=+120.589455354" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.134988 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.149562 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:03 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:03 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:03 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.149600 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.151179 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.151554 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.651542054 +0000 UTC m=+121.185092300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.193785 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" podStartSLOduration=100.193767958 podStartE2EDuration="1m40.193767958s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.192794944 +0000 UTC m=+120.726345210" watchObservedRunningTime="2025-10-13 13:09:03.193767958 +0000 UTC m=+120.727318204" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.196007 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jlq7q" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.243624 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fr6cp" podStartSLOduration=99.24360697 podStartE2EDuration="1m39.24360697s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.232994498 +0000 UTC m=+120.766544754" watchObservedRunningTime="2025-10-13 13:09:03.24360697 +0000 UTC m=+120.777157226" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.268565 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.269226 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.769209284 +0000 UTC m=+121.302759540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.299120 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fb6d2" podStartSLOduration=99.299102733 podStartE2EDuration="1m39.299102733s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.293411542 +0000 UTC m=+120.826961798" watchObservedRunningTime="2025-10-13 13:09:03.299102733 +0000 UTC m=+120.832652989" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.299695 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ngmd7" podStartSLOduration=99.299689617 podStartE2EDuration="1m39.299689617s" podCreationTimestamp="2025-10-13 
13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.269620184 +0000 UTC m=+120.803170440" watchObservedRunningTime="2025-10-13 13:09:03.299689617 +0000 UTC m=+120.833239883" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.336583 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" podStartSLOduration=99.336560979 podStartE2EDuration="1m39.336560979s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.328474939 +0000 UTC m=+120.862025205" watchObservedRunningTime="2025-10-13 13:09:03.336560979 +0000 UTC m=+120.870111235" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.369338 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bvpb8" podStartSLOduration=6.36932306 podStartE2EDuration="6.36932306s" podCreationTimestamp="2025-10-13 13:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.366740286 +0000 UTC m=+120.900290542" watchObservedRunningTime="2025-10-13 13:09:03.36932306 +0000 UTC m=+120.902873316" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.372935 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:03 crc 
kubenswrapper[4797]: E1013 13:09:03.373319 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.873304458 +0000 UTC m=+121.406854704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.403773 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" event={"ID":"19ba1f1c-943a-416f-94cf-0f16d9908a88","Type":"ContainerStarted","Data":"1c06edbf7066ae985f2fdac17e386159d49d66fdf46c0544f133f87145a678b1"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.411959 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.411982 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" event={"ID":"19ba1f1c-943a-416f-94cf-0f16d9908a88","Type":"ContainerStarted","Data":"519ea83f45f681d84dabbe4b49e3260898c6563d826430c4e7f36ec957a0b881"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.420788 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" podStartSLOduration=100.420773272 podStartE2EDuration="1m40.420773272s" 
podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.420185538 +0000 UTC m=+120.953735794" watchObservedRunningTime="2025-10-13 13:09:03.420773272 +0000 UTC m=+120.954323528" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.428887 4797 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7j4bs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.428961 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" podUID="19ba1f1c-943a-416f-94cf-0f16d9908a88" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.435236 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" event={"ID":"83a79b04-2fae-4444-90fd-a165aab2f901","Type":"ContainerStarted","Data":"c224690dade6c9e4c7bbf809747da66fcc80397c6d11867ce6a57ed0433f4cd2"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.469096 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8cwgx" event={"ID":"463ec1ab-bb91-45f7-b090-7aeb71365797","Type":"ContainerStarted","Data":"dc97610640267eaaa3f866da2aad714dbeecb249074e58c99047b70c773dd049"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.474190 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.474455 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:03.974440209 +0000 UTC m=+121.507990465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.481848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" event={"ID":"6939a36c-2200-4151-bd6f-50f54ecdf4c9","Type":"ContainerStarted","Data":"a897b3c63048b085aa49c035a278a9d1c82beb819258765e05c7eb45d34bbb97"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.495993 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" podStartSLOduration=100.495975382 podStartE2EDuration="1m40.495975382s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.484065808 +0000 UTC m=+121.017616074" watchObservedRunningTime="2025-10-13 13:09:03.495975382 +0000 UTC m=+121.029525638" Oct 13 13:09:03 crc kubenswrapper[4797]: 
I1013 13:09:03.505620 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9xnlf" event={"ID":"a2bc926f-f4d3-4811-ba2f-c7f520b910bb","Type":"ContainerStarted","Data":"84ad360650e9804c3eb206849c2a0fc4a20de80658f0870380270d7cdf811b86"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.515854 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9pdrq" podStartSLOduration=100.515838863 podStartE2EDuration="1m40.515838863s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.515189347 +0000 UTC m=+121.048739613" watchObservedRunningTime="2025-10-13 13:09:03.515838863 +0000 UTC m=+121.049389119" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.516732 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" event={"ID":"4e99badd-da38-4d6d-bf30-8c5d837e4ca9","Type":"ContainerStarted","Data":"e9aa9ca605911dcfb0d7ad19d76cf4cd727fafd90258419902ec04cc7186fd0e"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.530535 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" event={"ID":"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8","Type":"ContainerStarted","Data":"a6ba9e322eaf1e2f85ad91d783e6f16354ebc1e8015ed1cca3010cbbeab42f0e"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.530592 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" event={"ID":"16ad833e-f3a4-4af3-97b7-d960f7905292","Type":"ContainerStarted","Data":"3009883e0a4ae3b6934c7d173d7b926a3b43290aa1744514f8ed8f3c8006a471"} Oct 13 13:09:03 crc kubenswrapper[4797]: 
I1013 13:09:03.530607 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" event={"ID":"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c","Type":"ContainerStarted","Data":"fda203d8a706efb2f74a3087828415c41d4d529b3ab5f1c6fcf6d844a29665f9"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.530620 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" event={"ID":"ffa6af37-afb0-4226-be8c-83399f30793a","Type":"ContainerStarted","Data":"65938c00ea8855e5df715036658fef644b5b8182813893ef1c9fc82dbf28ff68"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.530634 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" event={"ID":"fc2e514b-61f8-47b1-975e-4a910550ecaa","Type":"ContainerStarted","Data":"4cf3177b501a3a7ea0df2a551e8fe048530dbc05e02581bf2a68351f5bb8569e"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.530646 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59" event={"ID":"7be202b3-8f35-4fe4-84ab-aea26389b7fd","Type":"ContainerStarted","Data":"33a6a690947bac649e0b930cb7a272b4f3a91010632970048e033ab537dcfe7e"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.530660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59" event={"ID":"7be202b3-8f35-4fe4-84ab-aea26389b7fd","Type":"ContainerStarted","Data":"6a7e6a689978aac2e8a8869da566b8a98717bd9924820ccd2884d074003df614"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.534432 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" 
event={"ID":"6007d943-10d1-4737-9821-874c5fe1a043","Type":"ContainerStarted","Data":"831afe7f2c1e70576bdf14454008e8e07ad28ba972fe46cae18f981784e18a13"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.546086 4797 generic.go:334] "Generic (PLEG): container finished" podID="ef80bed0-4ac7-4df7-87e9-72eb4299008a" containerID="ab8acc10fe54de374849b54d7c31fc2f5332acc17f83aae45dce2e1560a32f48" exitCode=0 Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.546142 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" event={"ID":"ef80bed0-4ac7-4df7-87e9-72eb4299008a","Type":"ContainerDied","Data":"ab8acc10fe54de374849b54d7c31fc2f5332acc17f83aae45dce2e1560a32f48"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.546166 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" event={"ID":"ef80bed0-4ac7-4df7-87e9-72eb4299008a","Type":"ContainerStarted","Data":"2190e54674ac3f5d5e6c598efc0a37e650453c7957e8a6a120cde79424c77e34"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.556618 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89wql" event={"ID":"85522ce2-6359-4ee0-bb51-7c19dffbfd09","Type":"ContainerStarted","Data":"86a4686935498adb2187dca461bd26721a359018dbfe109700717a7c31cf021b"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.570839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" event={"ID":"c9e38323-ab31-433d-a8c2-6f1814529fea","Type":"ContainerStarted","Data":"81ee424c31d97903cdf9bd8dca8d823cef13e83400f71ffdf277adb77a3ff735"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.583451 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.584966 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.084952173 +0000 UTC m=+121.618502499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.598508 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" podStartSLOduration=99.598491038 podStartE2EDuration="1m39.598491038s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.582672596 +0000 UTC m=+121.116222852" watchObservedRunningTime="2025-10-13 13:09:03.598491038 +0000 UTC m=+121.132041294" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.657880 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" podStartSLOduration=99.657864226 podStartE2EDuration="1m39.657864226s" podCreationTimestamp="2025-10-13 13:07:24 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:03.603257405 +0000 UTC m=+121.136807661" watchObservedRunningTime="2025-10-13 13:09:03.657864226 +0000 UTC m=+121.191414482" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.668398 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" event={"ID":"9a6c6613-63d4-485e-ba56-f2347d72872e","Type":"ContainerStarted","Data":"4ff529d22167513d0e99897cea918ca44dfaf7b04ad6542507ad6e45501bebd5"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.691887 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.692605 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.192578595 +0000 UTC m=+121.726128851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.692639 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.693387 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.193379934 +0000 UTC m=+121.726930180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.726780 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" event={"ID":"f156a9cc-9d97-4364-a17d-f02d8b0f8abe","Type":"ContainerStarted","Data":"db14cca1f14e085d5351534c0081eea2835370080fa32460b0a55b59d8c73a30"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.753975 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-29smx" event={"ID":"75093995-2ea3-48d0-bbe7-aa933ebb1dc5","Type":"ContainerStarted","Data":"7da5cdb66559a855e450f7aa47d8baa3fb7dfe04780e7c426b125d5c997ed419"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.765992 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" event={"ID":"df83a900-367b-4c33-8a5f-ac822552edde","Type":"ContainerStarted","Data":"05da7d7c88900b76f0730a87b083b75b92ea947f869f83b24baf5db4103ae6f9"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.767862 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" event={"ID":"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e","Type":"ContainerStarted","Data":"bec737b59ede9de28a373de81bf3f97f73570db396c765afed7e2810087ec1fb"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.795383 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.795789 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.295775397 +0000 UTC m=+121.829325653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.813909 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" event={"ID":"812d4016-c6ce-4440-bd68-3328eb8b9421","Type":"ContainerStarted","Data":"d820a5c8804397b5423f507a85f164cd48b614c44948de54fe553fe767563f8b"} Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.816070 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pdrq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.816126 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pdrq" podUID="2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.816275 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hxck2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.816314 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.820007 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:09:03 crc kubenswrapper[4797]: I1013 13:09:03.905594 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:03 crc kubenswrapper[4797]: E1013 13:09:03.906653 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.406633069 +0000 UTC m=+121.940183435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.006475 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.012969 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.512931878 +0000 UTC m=+122.046482134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.032166 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:04.532149923 +0000 UTC m=+122.065700179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.022822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.133393 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.133766 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.633750376 +0000 UTC m=+122.167300632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.143338 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:04 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:04 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:04 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.143373 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.238641 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.239234 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:04.739221894 +0000 UTC m=+122.272772150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.343725 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.343925 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.843896343 +0000 UTC m=+122.377446599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.348126 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.348486 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.848474976 +0000 UTC m=+122.382025232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.452489 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.452872 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:04.952785076 +0000 UTC m=+122.486335332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.555017 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.555656 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.05564455 +0000 UTC m=+122.589194796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.656595 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.656987 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.156972716 +0000 UTC m=+122.690522972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.762181 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.762759 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.262728892 +0000 UTC m=+122.796279148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.848517 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-29smx" event={"ID":"75093995-2ea3-48d0-bbe7-aa933ebb1dc5","Type":"ContainerStarted","Data":"0939481d54d4c6841f1685cb1fa380008b8b253a4529bede2dfaccfbaae2fed9"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.861101 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" event={"ID":"df83a900-367b-4c33-8a5f-ac822552edde","Type":"ContainerStarted","Data":"f92dbb31d00614271b089560f33bb236ae20bb0fa4ba7e9c494e99af0838614d"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.861986 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.866438 4797 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lwt47 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.866498 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" podUID="df83a900-367b-4c33-8a5f-ac822552edde" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.866847 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.867009 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.366987001 +0000 UTC m=+122.900537267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.867203 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.867513 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.367500853 +0000 UTC m=+122.901051109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.873303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" event={"ID":"ffa6af37-afb0-4226-be8c-83399f30793a","Type":"ContainerStarted","Data":"3945cbe05351fa809b3f7b2aa305128aaa31d2164377c6dd74b294bfed13b4b2"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.873345 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" event={"ID":"ffa6af37-afb0-4226-be8c-83399f30793a","Type":"ContainerStarted","Data":"950a653f489d204b9d7daf528aedd6e7e8e474c7bfb846dfaac4832cc59355a9"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.878449 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" event={"ID":"fc2e514b-61f8-47b1-975e-4a910550ecaa","Type":"ContainerStarted","Data":"4784279ce83fb7e1cdeef66da1ff994c99dbc1fb7d75ec0b692e2916b972f53d"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.889076 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" podStartSLOduration=100.889061587 podStartE2EDuration="1m40.889061587s" 
podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:04.470165526 +0000 UTC m=+122.003715802" watchObservedRunningTime="2025-10-13 13:09:04.889061587 +0000 UTC m=+122.422611843" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.892332 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" event={"ID":"9a6c6613-63d4-485e-ba56-f2347d72872e","Type":"ContainerStarted","Data":"39426315be65a1b5a2250a3b3ccbe93c6c64501dc9460f3aeac37e403ef1bb09"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.892374 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" event={"ID":"9a6c6613-63d4-485e-ba56-f2347d72872e","Type":"ContainerStarted","Data":"8e9fad29ffcc132832bf3042070d480a9995719e4752fe4dfdbaecba9c646029"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.895205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" event={"ID":"83a79b04-2fae-4444-90fd-a165aab2f901","Type":"ContainerStarted","Data":"fc90005de32c026dfbccff9bbb6e44d03571875b8592568a6adb44741541b962"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.896764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8cwgx" event={"ID":"463ec1ab-bb91-45f7-b090-7aeb71365797","Type":"ContainerStarted","Data":"bd15a154945342010c08b50321c2e3a8cff99bbf0b200721b1de8d31b153e5e7"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.901615 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" event={"ID":"4e99badd-da38-4d6d-bf30-8c5d837e4ca9","Type":"ContainerStarted","Data":"3bdfb89e1090e3e10e4099516850a93573c66c381c6def36353c8e8280a0d5bf"} Oct 
13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.904637 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" event={"ID":"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8","Type":"ContainerStarted","Data":"f10036462389d92d06128b790f4d75feb994ba86f7907f5b7174adba6fe2fa75"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.904736 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" event={"ID":"41cd2d07-6acd-47ab-8c03-c5dfbc10dbf8","Type":"ContainerStarted","Data":"12947738d6055b8b1fb6b04318890350d152795cd25a3493fe192985a107b096"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.904840 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.905620 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" event={"ID":"16ad833e-f3a4-4af3-97b7-d960f7905292","Type":"ContainerStarted","Data":"3f0b2656e44eeddc239e80bc0559cb9b08caa915c97bed890012df004fa1ada2"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.905840 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.907925 4797 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kv9dz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.908015 4797 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" podUID="16ad833e-f3a4-4af3-97b7-d960f7905292" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.909007 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" event={"ID":"812d4016-c6ce-4440-bd68-3328eb8b9421","Type":"ContainerStarted","Data":"8a49d4e80edab11dfc5fcb4304bcecc42ed0aea4d3243c2215a00e52ae4912bc"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.909038 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" event={"ID":"812d4016-c6ce-4440-bd68-3328eb8b9421","Type":"ContainerStarted","Data":"ac86e16f7be83643cb61765c469cf724e292ec359325eaf4909b875ffba5c1f7"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.913263 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59" event={"ID":"7be202b3-8f35-4fe4-84ab-aea26389b7fd","Type":"ContainerStarted","Data":"d09a22957942b22ab9a1b977765e927ce4534589aa255ee5badb18fe4c2253f3"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.915732 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" event={"ID":"ef80bed0-4ac7-4df7-87e9-72eb4299008a","Type":"ContainerStarted","Data":"74675f81afb70894ca44971d4e43fa7cfc3e5aa5319e6c6c355bb1a92ec09582"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.915902 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.932377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" event={"ID":"c9e38323-ab31-433d-a8c2-6f1814529fea","Type":"ContainerStarted","Data":"42e72570e9f267db9d64f42289df33645574ea0e069f86a1416950f6e625e9bd"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.932436 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" event={"ID":"c9e38323-ab31-433d-a8c2-6f1814529fea","Type":"ContainerStarted","Data":"5b0e8834cea1484b04e86a015ee39d58b3dd965a3e77e14cf93a2670fa808d2b"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.945573 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.945856 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.952740 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" podStartSLOduration=100.952723501 podStartE2EDuration="1m40.952723501s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:04.890204905 +0000 UTC m=+122.423755161" watchObservedRunningTime="2025-10-13 13:09:04.952723501 +0000 UTC m=+122.486273757" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.953794 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtgwk" podStartSLOduration=100.953789047 podStartE2EDuration="1m40.953789047s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-13 13:09:04.930016469 +0000 UTC m=+122.463566735" watchObservedRunningTime="2025-10-13 13:09:04.953789047 +0000 UTC m=+122.487339303" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.956034 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" event={"ID":"cb019f89-a9a5-4589-9116-f9ee8e3ffb3c","Type":"ContainerStarted","Data":"9b7747d4384e5f313fa50ded1ab7b08ed467d5fe346efd24d7fe03fdbb10e9e1"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.961469 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" podStartSLOduration=100.961448797 podStartE2EDuration="1m40.961448797s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:04.951185213 +0000 UTC m=+122.484735469" watchObservedRunningTime="2025-10-13 13:09:04.961448797 +0000 UTC m=+122.494999073" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.968600 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.982176 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" event={"ID":"b380ffb7-df80-4c6d-8c1a-1ed4a7d1208e","Type":"ContainerStarted","Data":"e8c988703525606abf68182904c42733b5f749d299a0f91ae000c0008da8179d"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.986737 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" event={"ID":"6007d943-10d1-4737-9821-874c5fe1a043","Type":"ContainerStarted","Data":"094803bd56fe1b91bc358662fef6c02e8290b16c748d71a7f8c9dcbdd9083e86"} Oct 13 13:09:04 crc kubenswrapper[4797]: E1013 13:09:04.988598 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.488560077 +0000 UTC m=+123.022110333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.998264 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.998509 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.999648 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" event={"ID":"6939a36c-2200-4151-bd6f-50f54ecdf4c9","Type":"ContainerStarted","Data":"4b23843d72e603f71c5419e06327912bf84c048eb5e4de6ed4a7298df76c4060"} Oct 13 13:09:04 crc kubenswrapper[4797]: I1013 13:09:04.999672 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" event={"ID":"6939a36c-2200-4151-bd6f-50f54ecdf4c9","Type":"ContainerStarted","Data":"a33f8e7d4bd5d38a8afa782f2a2b1d3192c1d48034e81fa5f29b52a35aee9aa4"} Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.035026 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7j4bs" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.041938 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.055766 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tc8mj" podStartSLOduration=101.055750589 podStartE2EDuration="1m41.055750589s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.015215607 +0000 UTC m=+122.548765863" watchObservedRunningTime="2025-10-13 13:09:05.055750589 +0000 UTC m=+122.589300845" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.057491 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7rxbv" podStartSLOduration=101.057484162 podStartE2EDuration="1m41.057484162s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.056529268 +0000 UTC m=+122.590079524" watchObservedRunningTime="2025-10-13 13:09:05.057484162 +0000 UTC m=+122.591034418" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.070631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.077592 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.577578319 +0000 UTC m=+123.111128575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.086304 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8k7h2" podStartSLOduration=101.086285134 podStartE2EDuration="1m41.086285134s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.083190418 +0000 UTC m=+122.616740674" watchObservedRunningTime="2025-10-13 13:09:05.086285134 +0000 UTC m=+122.619835390" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.135907 4797 patch_prober.go:28] interesting pod/apiserver-76f77b778f-72vb4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]log ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]etcd ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/generic-apiserver-start-informers ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/max-in-flight-filter ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 13 13:09:05 crc kubenswrapper[4797]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 13 13:09:05 crc kubenswrapper[4797]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/project.openshift.io-projectcache ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/openshift.io-startinformers ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 13 13:09:05 crc kubenswrapper[4797]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 13 13:09:05 crc kubenswrapper[4797]: livez check failed Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.135968 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" podUID="90473429-30ec-490e-a96f-d66fce3c994c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.137515 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:05 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:05 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:05 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.137553 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.141785 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qssvq" podStartSLOduration=101.141768107 podStartE2EDuration="1m41.141768107s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.122532421 +0000 UTC m=+122.656082677" watchObservedRunningTime="2025-10-13 13:09:05.141768107 +0000 UTC m=+122.675318363" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.173322 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.173469 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.67344637 +0000 UTC m=+123.206996626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.173572 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.173851 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.6738396 +0000 UTC m=+123.207389856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.174078 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" podStartSLOduration=101.174066325 podStartE2EDuration="1m41.174066325s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.169606175 +0000 UTC m=+122.703156441" watchObservedRunningTime="2025-10-13 13:09:05.174066325 +0000 UTC m=+122.707616581" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.193413 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" podStartSLOduration=102.193396993 podStartE2EDuration="1m42.193396993s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.191774893 +0000 UTC m=+122.725325169" watchObservedRunningTime="2025-10-13 13:09:05.193396993 +0000 UTC m=+122.726947249" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.260597 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8cwgx" podStartSLOduration=8.260578635 podStartE2EDuration="8.260578635s" podCreationTimestamp="2025-10-13 13:08:57 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.259016346 +0000 UTC m=+122.792566602" watchObservedRunningTime="2025-10-13 13:09:05.260578635 +0000 UTC m=+122.794128881" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.260792 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wnl59" podStartSLOduration=101.26078903 podStartE2EDuration="1m41.26078903s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.22804923 +0000 UTC m=+122.761599486" watchObservedRunningTime="2025-10-13 13:09:05.26078903 +0000 UTC m=+122.794339286" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.274862 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.275155 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.775140835 +0000 UTC m=+123.308691091 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.300404 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jsw4c" podStartSLOduration=101.30038524 podStartE2EDuration="1m41.30038524s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.299129688 +0000 UTC m=+122.832679944" watchObservedRunningTime="2025-10-13 13:09:05.30038524 +0000 UTC m=+122.833935496" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.378666 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.378972 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.878960053 +0000 UTC m=+123.412510309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.409329 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5pjbz" podStartSLOduration=101.409309314 podStartE2EDuration="1m41.409309314s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.407774386 +0000 UTC m=+122.941324652" watchObservedRunningTime="2025-10-13 13:09:05.409309314 +0000 UTC m=+122.942859570" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.480521 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.480716 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.980692799 +0000 UTC m=+123.514243055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.480957 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.481294 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:05.981284294 +0000 UTC m=+123.514834550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.520613 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2n27q" podStartSLOduration=101.520593786 podStartE2EDuration="1m41.520593786s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.509490641 +0000 UTC m=+123.043040897" watchObservedRunningTime="2025-10-13 13:09:05.520593786 +0000 UTC m=+123.054144042" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.581405 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.581597 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.081567814 +0000 UTC m=+123.615118070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.581718 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.582069 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.082060786 +0000 UTC m=+123.615611042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.601136 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mgjqp" podStartSLOduration=101.601118177 podStartE2EDuration="1m41.601118177s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.598688627 +0000 UTC m=+123.132238913" watchObservedRunningTime="2025-10-13 13:09:05.601118177 +0000 UTC m=+123.134668433" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.682736 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.682899 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.182872359 +0000 UTC m=+123.716422615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.683007 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.683537 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.183522486 +0000 UTC m=+123.717072802 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.707303 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkws7" podStartSLOduration=101.707284443 podStartE2EDuration="1m41.707284443s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:05.658522277 +0000 UTC m=+123.192072533" watchObservedRunningTime="2025-10-13 13:09:05.707284443 +0000 UTC m=+123.240834699" Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.784650 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.784964 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.284947844 +0000 UTC m=+123.818498100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.886207 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.886575 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.386560267 +0000 UTC m=+123.920110523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.987154 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.987429 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.487396001 +0000 UTC m=+124.020946267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:05 crc kubenswrapper[4797]: I1013 13:09:05.987698 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:05 crc kubenswrapper[4797]: E1013 13:09:05.988038 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.488024917 +0000 UTC m=+124.021575173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.004385 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-29smx" event={"ID":"75093995-2ea3-48d0-bbe7-aa933ebb1dc5","Type":"ContainerStarted","Data":"990cf881a81b9294eef6880a087072dd749b85498172460bd1c1b29dfdab5066"} Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.004563 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-29smx" Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.006283 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89wql" event={"ID":"85522ce2-6359-4ee0-bb51-7c19dffbfd09","Type":"ContainerStarted","Data":"9b634a4fe06f2af8a9100fe6f413386d283ade574024eab264713922fc8512fe"} Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.006327 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89wql" event={"ID":"85522ce2-6359-4ee0-bb51-7c19dffbfd09","Type":"ContainerStarted","Data":"4c4cd7a7e5531e00e6e4a25628c6eafa98bee10c6e367c7e6d7755796862fddf"} Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.017700 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lwt47" Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.019818 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccxwp" Oct 13 
13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.035914 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-29smx" podStartSLOduration=9.03588586 podStartE2EDuration="9.03588586s" podCreationTimestamp="2025-10-13 13:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:06.028856666 +0000 UTC m=+123.562406942" watchObservedRunningTime="2025-10-13 13:09:06.03588586 +0000 UTC m=+123.569436116" Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.109424 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.109790 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.609775148 +0000 UTC m=+124.143325404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.141267 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:06 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:06 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:06 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.141318 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.211707 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.216000 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:06.715987595 +0000 UTC m=+124.249537851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.220701 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kv9dz" Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.312857 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.312947 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.812932991 +0000 UTC m=+124.346483247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.313309 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.314176 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.81411045 +0000 UTC m=+124.347660706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.414192 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.414353 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.914329839 +0000 UTC m=+124.447880095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.414428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.414730 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:06.914719239 +0000 UTC m=+124.448269495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.515260 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.515478 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.01544598 +0000 UTC m=+124.548996246 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.515658 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.516039 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.016026425 +0000 UTC m=+124.549576691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.616831 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.617005 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.116984792 +0000 UTC m=+124.650535068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.699903 4797 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.717509 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.717972 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.217952439 +0000 UTC m=+124.751502765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.818292 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.818719 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.318702141 +0000 UTC m=+124.852252397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:06 crc kubenswrapper[4797]: I1013 13:09:06.919902 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:06 crc kubenswrapper[4797]: E1013 13:09:06.920270 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.420254212 +0000 UTC m=+124.953804468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.013383 4797 generic.go:334] "Generic (PLEG): container finished" podID="fc2e514b-61f8-47b1-975e-4a910550ecaa" containerID="4784279ce83fb7e1cdeef66da1ff994c99dbc1fb7d75ec0b692e2916b972f53d" exitCode=0 Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.013467 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" event={"ID":"fc2e514b-61f8-47b1-975e-4a910550ecaa","Type":"ContainerDied","Data":"4784279ce83fb7e1cdeef66da1ff994c99dbc1fb7d75ec0b692e2916b972f53d"} Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.016733 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89wql" event={"ID":"85522ce2-6359-4ee0-bb51-7c19dffbfd09","Type":"ContainerStarted","Data":"b559750785c793599aac924be205b85f9ff6e9ade402855f24a2a5f1d810ca49"} Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.016770 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89wql" event={"ID":"85522ce2-6359-4ee0-bb51-7c19dffbfd09","Type":"ContainerStarted","Data":"7bf08de800cb2de9cfd2ecfe1a75f702dfb9f972546ec41da840c0f346d236ba"} Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.021766 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.021880 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.521864615 +0000 UTC m=+125.055414871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.022630 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.023433 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.523421844 +0000 UTC m=+125.056972110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.049772 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-89wql" podStartSLOduration=10.049753075 podStartE2EDuration="10.049753075s" podCreationTimestamp="2025-10-13 13:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:07.046416443 +0000 UTC m=+124.579966729" watchObservedRunningTime="2025-10-13 13:09:07.049753075 +0000 UTC m=+124.583303331" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.123205 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.123421 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.623385186 +0000 UTC m=+125.156935442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.147330 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:07 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:07 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:07 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.147411 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.224394 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.224744 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-13 13:09:07.724728803 +0000 UTC m=+125.258279059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.325421 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.325533 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.825516955 +0000 UTC m=+125.359067211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.325644 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.325977 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.825966046 +0000 UTC m=+125.359516302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.347172 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tfpf9"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.348244 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.350385 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.355651 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfpf9"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.426505 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.426682 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.926661367 +0000 UTC m=+125.460211623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.426734 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-catalog-content\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.426872 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.426903 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-utilities\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.426935 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxgp\" (UniqueName: 
\"kubernetes.io/projected/8b97354f-235c-4e1e-9121-13d644be8813-kube-api-access-xtxgp\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: E1013 13:09:07.427176 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 13:09:07.92716706 +0000 UTC m=+125.460717316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x9zw9" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.453418 4797 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-13T13:09:06.699934053Z","Handler":null,"Name":""} Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.457251 4797 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.457286 4797 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.527879 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.528040 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-catalog-content\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.528105 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-utilities\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.528128 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxgp\" (UniqueName: \"kubernetes.io/projected/8b97354f-235c-4e1e-9121-13d644be8813-kube-api-access-xtxgp\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.528546 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-catalog-content\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.528589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-utilities\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.539653 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.544459 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2m52s"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.545530 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.548649 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.558537 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxgp\" (UniqueName: \"kubernetes.io/projected/8b97354f-235c-4e1e-9121-13d644be8813-kube-api-access-xtxgp\") pod \"certified-operators-tfpf9\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") " pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.569134 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2m52s"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.634587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.635933 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-utilities\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.636063 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-772zf\" (UniqueName: \"kubernetes.io/projected/922167a9-88b5-40b2-8dd3-b04e4ba3f035-kube-api-access-772zf\") pod 
\"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.637639 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.639133 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.639071 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-catalog-content\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.661151 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.671530 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x9zw9\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.740301 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-utilities\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.740397 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-772zf\" (UniqueName: \"kubernetes.io/projected/922167a9-88b5-40b2-8dd3-b04e4ba3f035-kube-api-access-772zf\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.740501 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-catalog-content\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.740864 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-utilities\") pod \"community-operators-2m52s\" (UID: 
\"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.741405 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-catalog-content\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.742393 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rzpc2"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.743592 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.745001 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.754642 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.776010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-772zf\" (UniqueName: \"kubernetes.io/projected/922167a9-88b5-40b2-8dd3-b04e4ba3f035-kube-api-access-772zf\") pod \"community-operators-2m52s\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") " pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.786217 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rzpc2"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.841786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-catalog-content\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.841873 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74kt\" (UniqueName: \"kubernetes.io/projected/ae42829d-c380-4186-9b4c-55c2221fffd7-kube-api-access-d74kt\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.841961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-utilities\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.896832 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.899486 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfpf9"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.943436 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-catalog-content\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.943513 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74kt\" (UniqueName: \"kubernetes.io/projected/ae42829d-c380-4186-9b4c-55c2221fffd7-kube-api-access-d74kt\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.943577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-utilities\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.944165 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-utilities\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.944419 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-catalog-content\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.950159 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxd8n"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.951584 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.953742 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x9zw9"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.959275 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxd8n"] Oct 13 13:09:07 crc kubenswrapper[4797]: I1013 13:09:07.965826 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74kt\" (UniqueName: \"kubernetes.io/projected/ae42829d-c380-4186-9b4c-55c2221fffd7-kube-api-access-d74kt\") pod \"certified-operators-rzpc2\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.023253 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" event={"ID":"fcbb5cc0-3585-4ddb-aa28-1c1097d59318","Type":"ContainerStarted","Data":"4e4b50f7512eba4570a3e059afddc8eb940e71032452d8d77612a7f9a13e77a8"} Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.024671 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfpf9" 
event={"ID":"8b97354f-235c-4e1e-9121-13d644be8813","Type":"ContainerStarted","Data":"e4846dea3e620fa577e8984c87dcb74b8db637d1996a2c8257fa7f74d0707144"} Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.045116 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-utilities\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.045264 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-catalog-content\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.045295 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scd2h\" (UniqueName: \"kubernetes.io/projected/98e37ff7-36ac-4230-b449-83d3d2627535-kube-api-access-scd2h\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.065571 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.101734 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2m52s"] Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.146366 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-catalog-content\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.146902 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scd2h\" (UniqueName: \"kubernetes.io/projected/98e37ff7-36ac-4230-b449-83d3d2627535-kube-api-access-scd2h\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.146957 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-utilities\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.146976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-catalog-content\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.147270 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-utilities\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.150915 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:08 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:08 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:08 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.150983 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.166779 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scd2h\" (UniqueName: \"kubernetes.io/projected/98e37ff7-36ac-4230-b449-83d3d2627535-kube-api-access-scd2h\") pod \"community-operators-mxd8n\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.228258 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.247856 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc2e514b-61f8-47b1-975e-4a910550ecaa-config-volume\") pod \"fc2e514b-61f8-47b1-975e-4a910550ecaa\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.247895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzclr\" (UniqueName: \"kubernetes.io/projected/fc2e514b-61f8-47b1-975e-4a910550ecaa-kube-api-access-kzclr\") pod \"fc2e514b-61f8-47b1-975e-4a910550ecaa\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.247979 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc2e514b-61f8-47b1-975e-4a910550ecaa-secret-volume\") pod \"fc2e514b-61f8-47b1-975e-4a910550ecaa\" (UID: \"fc2e514b-61f8-47b1-975e-4a910550ecaa\") " Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.249788 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc2e514b-61f8-47b1-975e-4a910550ecaa-config-volume" (OuterVolumeSpecName: "config-volume") pod "fc2e514b-61f8-47b1-975e-4a910550ecaa" (UID: "fc2e514b-61f8-47b1-975e-4a910550ecaa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.251318 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2e514b-61f8-47b1-975e-4a910550ecaa-kube-api-access-kzclr" (OuterVolumeSpecName: "kube-api-access-kzclr") pod "fc2e514b-61f8-47b1-975e-4a910550ecaa" (UID: "fc2e514b-61f8-47b1-975e-4a910550ecaa"). 
InnerVolumeSpecName "kube-api-access-kzclr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.252058 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2e514b-61f8-47b1-975e-4a910550ecaa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fc2e514b-61f8-47b1-975e-4a910550ecaa" (UID: "fc2e514b-61f8-47b1-975e-4a910550ecaa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.271032 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.278799 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rzpc2"] Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.350526 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc2e514b-61f8-47b1-975e-4a910550ecaa-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.350564 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzclr\" (UniqueName: \"kubernetes.io/projected/fc2e514b-61f8-47b1-975e-4a910550ecaa-kube-api-access-kzclr\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.350577 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc2e514b-61f8-47b1-975e-4a910550ecaa-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:08 crc kubenswrapper[4797]: I1013 13:09:08.736083 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxd8n"] Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.031523 4797 generic.go:334] "Generic (PLEG): container 
finished" podID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerID="f48d2ec912c9a8ee0d103cee62910aed5e4df10c4070c6fb8b4e0dfb9000f85d" exitCode=0 Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.031592 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rzpc2" event={"ID":"ae42829d-c380-4186-9b4c-55c2221fffd7","Type":"ContainerDied","Data":"f48d2ec912c9a8ee0d103cee62910aed5e4df10c4070c6fb8b4e0dfb9000f85d"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.031618 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rzpc2" event={"ID":"ae42829d-c380-4186-9b4c-55c2221fffd7","Type":"ContainerStarted","Data":"e200f596b1df7e174fb7168f36b56cbae1e7ad7cfcd60fa753d0a48de8328958"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.033971 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.033953 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" event={"ID":"fcbb5cc0-3585-4ddb-aa28-1c1097d59318","Type":"ContainerStarted","Data":"bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.034072 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.037161 4797 generic.go:334] "Generic (PLEG): container finished" podID="8b97354f-235c-4e1e-9121-13d644be8813" containerID="2bfad293734fa09c3b0552fe4361032662c4545ed4ff057a5201a40d9fdb0a75" exitCode=0 Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.037253 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfpf9" 
event={"ID":"8b97354f-235c-4e1e-9121-13d644be8813","Type":"ContainerDied","Data":"2bfad293734fa09c3b0552fe4361032662c4545ed4ff057a5201a40d9fdb0a75"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.039338 4797 generic.go:334] "Generic (PLEG): container finished" podID="98e37ff7-36ac-4230-b449-83d3d2627535" containerID="b171a07754fe27c866b9b0d97d63589e0687bec227ea39276ecb680c18c3dfbd" exitCode=0 Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.039426 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerDied","Data":"b171a07754fe27c866b9b0d97d63589e0687bec227ea39276ecb680c18c3dfbd"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.039459 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerStarted","Data":"9a92e0647b53ff914499105bd862e5299327974808b56fa19e56ee32eb365339"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.043151 4797 generic.go:334] "Generic (PLEG): container finished" podID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerID="9d735cd13ac34550de620c04613fc46515d29ed9f19210f858844b597e98e435" exitCode=0 Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.043214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerDied","Data":"9d735cd13ac34550de620c04613fc46515d29ed9f19210f858844b597e98e435"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.043241 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerStarted","Data":"bccb7f864b5c89d28ed5610a395821dba41c2177ac9ec1398438ba2296818b14"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 
13:09:09.051837 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" event={"ID":"fc2e514b-61f8-47b1-975e-4a910550ecaa","Type":"ContainerDied","Data":"4cf3177b501a3a7ea0df2a551e8fe048530dbc05e02581bf2a68351f5bb8569e"} Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.051887 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cf3177b501a3a7ea0df2a551e8fe048530dbc05e02581bf2a68351f5bb8569e" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.052034 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.080915 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" podStartSLOduration=105.08089807 podStartE2EDuration="1m45.08089807s" podCreationTimestamp="2025-10-13 13:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:09.076653685 +0000 UTC m=+126.610203941" watchObservedRunningTime="2025-10-13 13:09:09.08089807 +0000 UTC m=+126.614448326" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.135112 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:09 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:09 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:09 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.135165 4797 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.244018 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.343217 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v4q5g"] Oct 13 13:09:09 crc kubenswrapper[4797]: E1013 13:09:09.343472 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2e514b-61f8-47b1-975e-4a910550ecaa" containerName="collect-profiles" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.343487 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2e514b-61f8-47b1-975e-4a910550ecaa" containerName="collect-profiles" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.343594 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2e514b-61f8-47b1-975e-4a910550ecaa" containerName="collect-profiles" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.344467 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.346546 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.359873 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4q5g"] Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.381385 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-224st\" (UniqueName: \"kubernetes.io/projected/51eb3ebb-3fa0-4571-bb5e-ca393071f745-kube-api-access-224st\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.381640 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-catalog-content\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.381725 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-utilities\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.385356 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.386388 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.394174 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.394363 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.395276 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.482916 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-catalog-content\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.482975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-utilities\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.483000 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.483068 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.483099 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-224st\" (UniqueName: \"kubernetes.io/projected/51eb3ebb-3fa0-4571-bb5e-ca393071f745-kube-api-access-224st\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.483517 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-catalog-content\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.483567 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-utilities\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.503839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-224st\" (UniqueName: \"kubernetes.io/projected/51eb3ebb-3fa0-4571-bb5e-ca393071f745-kube-api-access-224st\") pod \"redhat-marketplace-v4q5g\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") " pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.584253 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.584345 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.584541 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.605538 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.669114 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.724546 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.746878 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-thw49"] Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.750097 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.758180 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thw49"] Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.794377 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.794707 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.794829 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-catalog-content\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.794987 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-utilities\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.795142 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvtsv\" 
(UniqueName: \"kubernetes.io/projected/5d86244d-7468-424e-92d6-13aa23a66ad8-kube-api-access-tvtsv\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.825720 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gbkvx" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.829237 4797 patch_prober.go:28] interesting pod/console-f9d7485db-8xklm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.829283 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8xklm" podUID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.895776 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvtsv\" (UniqueName: \"kubernetes.io/projected/5d86244d-7468-424e-92d6-13aa23a66ad8-kube-api-access-tvtsv\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.896458 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-catalog-content\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 
13:09:09.896540 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-utilities\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.897406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-catalog-content\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.898034 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-utilities\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.913403 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvtsv\" (UniqueName: \"kubernetes.io/projected/5d86244d-7468-424e-92d6-13aa23a66ad8-kube-api-access-tvtsv\") pod \"redhat-marketplace-thw49\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.951062 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:09:09 crc kubenswrapper[4797]: I1013 13:09:09.956552 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-72vb4" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.016927 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.086708 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a","Type":"ContainerStarted","Data":"c23322e894dbaddba3fd2f9a22bbad842bbfdd2df1ec48f188d29a8e6e6327e3"} Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.107531 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.134496 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:10 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:10 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:10 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.134550 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.145331 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4q5g"] Oct 13 13:09:10 crc kubenswrapper[4797]: W1013 13:09:10.154731 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51eb3ebb_3fa0_4571_bb5e_ca393071f745.slice/crio-b120c58051a132baad0fc5a0d1065da2ad3104aeec46a2a3a3c6aa1ad844639f WatchSource:0}: Error finding container 
b120c58051a132baad0fc5a0d1065da2ad3104aeec46a2a3a3c6aa1ad844639f: Status 404 returned error can't find the container with id b120c58051a132baad0fc5a0d1065da2ad3104aeec46a2a3a3c6aa1ad844639f Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.404537 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thw49"] Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.510919 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pdrq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.510938 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pdrq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.510981 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9pdrq" podUID="2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.510984 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pdrq" podUID="2b8b627e-6ee4-4ba8-b83f-fc84cf2b2c11" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.558707 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n7gzd"] Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.559893 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.562380 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.577378 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n7gzd"] Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.680899 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.731207 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54tbd\" (UniqueName: \"kubernetes.io/projected/3ae58328-3b33-44bc-a168-9d19d64bc09c-kube-api-access-54tbd\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.731271 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-catalog-content\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.731307 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-utilities\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.832857 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54tbd\" (UniqueName: \"kubernetes.io/projected/3ae58328-3b33-44bc-a168-9d19d64bc09c-kube-api-access-54tbd\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.832998 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-catalog-content\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.833067 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-utilities\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.833774 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-utilities\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.834089 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-catalog-content\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.874784 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-54tbd\" (UniqueName: \"kubernetes.io/projected/3ae58328-3b33-44bc-a168-9d19d64bc09c-kube-api-access-54tbd\") pod \"redhat-operators-n7gzd\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") " pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.966997 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.967632 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gwwpp"] Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.971210 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:10 crc kubenswrapper[4797]: I1013 13:09:10.972101 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwwpp"] Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.054594 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-catalog-content\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.054683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlskg\" (UniqueName: \"kubernetes.io/projected/e73c1f4e-419f-4073-966f-dd76a4b93916-kube-api-access-nlskg\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.054838 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-utilities\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.129464 4797 generic.go:334] "Generic (PLEG): container finished" podID="54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a" containerID="730647c9eeb27cab719381f40e70d3c87d1df364a29cc0d1c6c9b2832e9d65b0" exitCode=0 Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.130120 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a","Type":"ContainerDied","Data":"730647c9eeb27cab719381f40e70d3c87d1df364a29cc0d1c6c9b2832e9d65b0"} Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.131996 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.143279 4797 patch_prober.go:28] interesting pod/router-default-5444994796-ngmd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 13:09:11 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Oct 13 13:09:11 crc kubenswrapper[4797]: [+]process-running ok Oct 13 13:09:11 crc kubenswrapper[4797]: healthz check failed Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.143356 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ngmd7" podUID="62769b11-1e27-44f2-836c-da79ac2655b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.144062 4797 generic.go:334] "Generic (PLEG): container finished" podID="5d86244d-7468-424e-92d6-13aa23a66ad8" 
containerID="d814c16fd2b03bf5b39cf81a363d693f9f7eebaa721f35ae4a8f05ac8557bfe2" exitCode=0 Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.144244 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thw49" event={"ID":"5d86244d-7468-424e-92d6-13aa23a66ad8","Type":"ContainerDied","Data":"d814c16fd2b03bf5b39cf81a363d693f9f7eebaa721f35ae4a8f05ac8557bfe2"} Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.144431 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thw49" event={"ID":"5d86244d-7468-424e-92d6-13aa23a66ad8","Type":"ContainerStarted","Data":"d49520929c37d7d0c14e511fa909f51cbf1712c705fddc80cd8515dd4a01222c"} Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.157627 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-utilities\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.157790 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-catalog-content\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.159099 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlskg\" (UniqueName: \"kubernetes.io/projected/e73c1f4e-419f-4073-966f-dd76a4b93916-kube-api-access-nlskg\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.160136 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-catalog-content\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.161938 4797 generic.go:334] "Generic (PLEG): container finished" podID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerID="9d9f2ec3a9a051e9c56b4e148ebc256b0a20d16677899f3c219d698ee1a00619" exitCode=0 Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.161980 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4q5g" event={"ID":"51eb3ebb-3fa0-4571-bb5e-ca393071f745","Type":"ContainerDied","Data":"9d9f2ec3a9a051e9c56b4e148ebc256b0a20d16677899f3c219d698ee1a00619"} Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.162012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4q5g" event={"ID":"51eb3ebb-3fa0-4571-bb5e-ca393071f745","Type":"ContainerStarted","Data":"b120c58051a132baad0fc5a0d1065da2ad3104aeec46a2a3a3c6aa1ad844639f"} Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.162377 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-utilities\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.195044 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlskg\" (UniqueName: \"kubernetes.io/projected/e73c1f4e-419f-4073-966f-dd76a4b93916-kube-api-access-nlskg\") pod \"redhat-operators-gwwpp\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc 
kubenswrapper[4797]: I1013 13:09:11.282597 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n7gzd"] Oct 13 13:09:11 crc kubenswrapper[4797]: W1013 13:09:11.313659 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae58328_3b33_44bc_a168_9d19d64bc09c.slice/crio-41c546e00a0557c5ab95e26123ba5397ed6de4aa808f0c6bf097032cb961e5e0 WatchSource:0}: Error finding container 41c546e00a0557c5ab95e26123ba5397ed6de4aa808f0c6bf097032cb961e5e0: Status 404 returned error can't find the container with id 41c546e00a0557c5ab95e26123ba5397ed6de4aa808f0c6bf097032cb961e5e0 Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.435447 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:11 crc kubenswrapper[4797]: I1013 13:09:11.902259 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwwpp"] Oct 13 13:09:11 crc kubenswrapper[4797]: W1013 13:09:11.915487 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode73c1f4e_419f_4073_966f_dd76a4b93916.slice/crio-fa371ac6cce548ca43793d7964efda6f6ed35e2638e3596dee5d3b9c0367f3da WatchSource:0}: Error finding container fa371ac6cce548ca43793d7964efda6f6ed35e2638e3596dee5d3b9c0367f3da: Status 404 returned error can't find the container with id fa371ac6cce548ca43793d7964efda6f6ed35e2638e3596dee5d3b9c0367f3da Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.140495 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.193370 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-swfqw_df26da53-3ff8-4402-b5f6-25166b6b0f8a/cluster-samples-operator/0.log" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.193444 4797 generic.go:334] "Generic (PLEG): container finished" podID="df26da53-3ff8-4402-b5f6-25166b6b0f8a" containerID="58ff5b9cd673a48fbec243e0f692229eaddb0933325017c66068cbfe7386d815" exitCode=2 Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.193543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" event={"ID":"df26da53-3ff8-4402-b5f6-25166b6b0f8a","Type":"ContainerDied","Data":"58ff5b9cd673a48fbec243e0f692229eaddb0933325017c66068cbfe7386d815"} Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.194139 4797 scope.go:117] "RemoveContainer" containerID="58ff5b9cd673a48fbec243e0f692229eaddb0933325017c66068cbfe7386d815" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.195923 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerStarted","Data":"fa371ac6cce548ca43793d7964efda6f6ed35e2638e3596dee5d3b9c0367f3da"} Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.208153 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerID="651ad3fffa71e7863684a923d61c94e9a584c4780104b60cdd4b8bf30321ef9e" exitCode=0 Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.208900 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7gzd" event={"ID":"3ae58328-3b33-44bc-a168-9d19d64bc09c","Type":"ContainerDied","Data":"651ad3fffa71e7863684a923d61c94e9a584c4780104b60cdd4b8bf30321ef9e"} Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.208959 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-n7gzd" event={"ID":"3ae58328-3b33-44bc-a168-9d19d64bc09c","Type":"ContainerStarted","Data":"41c546e00a0557c5ab95e26123ba5397ed6de4aa808f0c6bf097032cb961e5e0"} Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.213923 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ngmd7" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.665052 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.692908 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kube-api-access\") pod \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.693215 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kubelet-dir\") pod \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\" (UID: \"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a\") " Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.693596 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a" (UID: "54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.703180 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a" (UID: "54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.797018 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:12 crc kubenswrapper[4797]: I1013 13:09:12.797067 4797 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.252630 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.255335 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a","Type":"ContainerDied","Data":"c23322e894dbaddba3fd2f9a22bbad842bbfdd2df1ec48f188d29a8e6e6327e3"} Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.255384 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c23322e894dbaddba3fd2f9a22bbad842bbfdd2df1ec48f188d29a8e6e6327e3" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.256425 4797 generic.go:334] "Generic (PLEG): container finished" podID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerID="0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8" exitCode=0 Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.256483 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerDied","Data":"0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8"} Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.275270 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-swfqw_df26da53-3ff8-4402-b5f6-25166b6b0f8a/cluster-samples-operator/0.log" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.276403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-swfqw" event={"ID":"df26da53-3ff8-4402-b5f6-25166b6b0f8a","Type":"ContainerStarted","Data":"6c0ca473b27fa15f043da1b6c96f9e736acd11c89716d54207ed490e24c4af5a"} Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.478788 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 13 13:09:13 crc kubenswrapper[4797]: E1013 13:09:13.479094 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a" containerName="pruner" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.479114 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a" containerName="pruner" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.479249 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ecdf4a-fcad-43f1-8a08-6a9c9e0e619a" containerName="pruner" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.479703 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.482573 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.483319 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.495594 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.518142 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.518276 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.623038 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.623214 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.623114 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.642781 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:13 crc kubenswrapper[4797]: I1013 13:09:13.828023 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:14 crc kubenswrapper[4797]: I1013 13:09:14.092266 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 13 13:09:14 crc kubenswrapper[4797]: W1013 13:09:14.112048 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbe1ba19c_2cc3_4c5b_babf_f61c63d1f504.slice/crio-3071ba9a74a1f1374a95d9e13d907889b097c2c5d18fddcf0669af41cb85615f WatchSource:0}: Error finding container 3071ba9a74a1f1374a95d9e13d907889b097c2c5d18fddcf0669af41cb85615f: Status 404 returned error can't find the container with id 3071ba9a74a1f1374a95d9e13d907889b097c2c5d18fddcf0669af41cb85615f Oct 13 13:09:14 crc kubenswrapper[4797]: I1013 13:09:14.291908 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"be1ba19c-2cc3-4c5b-babf-f61c63d1f504","Type":"ContainerStarted","Data":"3071ba9a74a1f1374a95d9e13d907889b097c2c5d18fddcf0669af41cb85615f"} Oct 13 13:09:15 crc kubenswrapper[4797]: I1013 13:09:15.321169 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"be1ba19c-2cc3-4c5b-babf-f61c63d1f504","Type":"ContainerStarted","Data":"687205838b4399db8fafd04ef520db5bf022b97f17ace1afcf5380249b0aff75"} Oct 13 13:09:15 crc kubenswrapper[4797]: I1013 13:09:15.350123 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.350106202 podStartE2EDuration="2.350106202s" podCreationTimestamp="2025-10-13 13:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:15.337627469 +0000 UTC m=+132.871177735" watchObservedRunningTime="2025-10-13 13:09:15.350106202 +0000 UTC m=+132.883656448" Oct 13 13:09:16 crc 
kubenswrapper[4797]: I1013 13:09:16.365213 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-29smx" Oct 13 13:09:16 crc kubenswrapper[4797]: I1013 13:09:16.390772 4797 generic.go:334] "Generic (PLEG): container finished" podID="be1ba19c-2cc3-4c5b-babf-f61c63d1f504" containerID="687205838b4399db8fafd04ef520db5bf022b97f17ace1afcf5380249b0aff75" exitCode=0 Oct 13 13:09:16 crc kubenswrapper[4797]: I1013 13:09:16.390833 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"be1ba19c-2cc3-4c5b-babf-f61c63d1f504","Type":"ContainerDied","Data":"687205838b4399db8fafd04ef520db5bf022b97f17ace1afcf5380249b0aff75"} Oct 13 13:09:19 crc kubenswrapper[4797]: I1013 13:09:19.794039 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:09:19 crc kubenswrapper[4797]: I1013 13:09:19.798673 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:09:20 crc kubenswrapper[4797]: I1013 13:09:20.514027 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9pdrq" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.014000 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.097994 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kubelet-dir\") pod \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.098040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kube-api-access\") pod \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\" (UID: \"be1ba19c-2cc3-4c5b-babf-f61c63d1f504\") " Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.098107 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "be1ba19c-2cc3-4c5b-babf-f61c63d1f504" (UID: "be1ba19c-2cc3-4c5b-babf-f61c63d1f504"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.098416 4797 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.103850 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "be1ba19c-2cc3-4c5b-babf-f61c63d1f504" (UID: "be1ba19c-2cc3-4c5b-babf-f61c63d1f504"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.199702 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be1ba19c-2cc3-4c5b-babf-f61c63d1f504-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.432123 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"be1ba19c-2cc3-4c5b-babf-f61c63d1f504","Type":"ContainerDied","Data":"3071ba9a74a1f1374a95d9e13d907889b097c2c5d18fddcf0669af41cb85615f"} Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.432557 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3071ba9a74a1f1374a95d9e13d907889b097c2c5d18fddcf0669af41cb85615f" Oct 13 13:09:22 crc kubenswrapper[4797]: I1013 13:09:22.432161 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 13:09:27 crc kubenswrapper[4797]: I1013 13:09:27.761078 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.686836 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.687535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.687590 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.687624 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.688792 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.689597 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.689720 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.698467 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 
13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.700185 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.706675 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.712082 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.712233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.964206 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 13:09:31 crc kubenswrapper[4797]: I1013 13:09:31.980490 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:09:32 crc kubenswrapper[4797]: I1013 13:09:32.009667 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 13:09:33 crc kubenswrapper[4797]: E1013 13:09:33.711799 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 13 13:09:33 crc kubenswrapper[4797]: E1013 13:09:33.712448 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-224st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOption
s:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-v4q5g_openshift-marketplace(51eb3ebb-3fa0-4571-bb5e-ca393071f745): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 13 13:09:33 crc kubenswrapper[4797]: E1013 13:09:33.713741 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v4q5g" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.054648 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v4q5g" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.140909 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.141203 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtxgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tfpf9_openshift-marketplace(8b97354f-235c-4e1e-9121-13d644be8813): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.142371 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tfpf9" podUID="8b97354f-235c-4e1e-9121-13d644be8813" Oct 13 13:09:35 crc 
kubenswrapper[4797]: E1013 13:09:35.150866 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.150979 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d74kt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-rzpc2_openshift-marketplace(ae42829d-c380-4186-9b4c-55c2221fffd7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.152234 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rzpc2" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.267674 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.268274 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54tbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n7gzd_openshift-marketplace(3ae58328-3b33-44bc-a168-9d19d64bc09c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.274521 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n7gzd" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" Oct 13 13:09:35 crc 
kubenswrapper[4797]: I1013 13:09:35.506310 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerStarted","Data":"4950f04c819ac223fc7dd75830eb4f5e560bdb0e234fc9cd584fcbea7dd59886"} Oct 13 13:09:35 crc kubenswrapper[4797]: I1013 13:09:35.518540 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerStarted","Data":"1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1"} Oct 13 13:09:35 crc kubenswrapper[4797]: I1013 13:09:35.524582 4797 generic.go:334] "Generic (PLEG): container finished" podID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerID="5439ff3e0b94db491aa40fe0e35c91b37947752f0e26d6e3b11fd5913ab224ea" exitCode=0 Oct 13 13:09:35 crc kubenswrapper[4797]: I1013 13:09:35.524669 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thw49" event={"ID":"5d86244d-7468-424e-92d6-13aa23a66ad8","Type":"ContainerDied","Data":"5439ff3e0b94db491aa40fe0e35c91b37947752f0e26d6e3b11fd5913ab224ea"} Oct 13 13:09:35 crc kubenswrapper[4797]: I1013 13:09:35.530179 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerStarted","Data":"572b0d409bac4e836a48417814c176bcfe79d33e90ab89b70eebaba4de59f27f"} Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.534654 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rzpc2" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.534740 4797 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n7gzd" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" Oct 13 13:09:35 crc kubenswrapper[4797]: E1013 13:09:35.534848 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tfpf9" podUID="8b97354f-235c-4e1e-9121-13d644be8813" Oct 13 13:09:35 crc kubenswrapper[4797]: W1013 13:09:35.561512 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-597700c85186f50585462c444965d2a183ef1ead5478fb1f485eeaec636b8ccc WatchSource:0}: Error finding container 597700c85186f50585462c444965d2a183ef1ead5478fb1f485eeaec636b8ccc: Status 404 returned error can't find the container with id 597700c85186f50585462c444965d2a183ef1ead5478fb1f485eeaec636b8ccc Oct 13 13:09:35 crc kubenswrapper[4797]: W1013 13:09:35.808271 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cc9adce186f65a6836b47193b00615f8e606c43178abded41db960d1f3953f7e WatchSource:0}: Error finding container cc9adce186f65a6836b47193b00615f8e606c43178abded41db960d1f3953f7e: Status 404 returned error can't find the container with id cc9adce186f65a6836b47193b00615f8e606c43178abded41db960d1f3953f7e Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.537613 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ffe713f06e75104af5e317d96abfc76d690758c823c21100dc4d5e88420a5c4b"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.537983 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2f83a6a5095f46ef95b27f4558ddbc5dc59e598d97c81e09857c15e52c32ca20"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.541612 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thw49" event={"ID":"5d86244d-7468-424e-92d6-13aa23a66ad8","Type":"ContainerStarted","Data":"dec71000db7c0e70f89283a2d6a8bb0f0120abbb67a30e251bed3f8a1fca3645"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.544122 4797 generic.go:334] "Generic (PLEG): container finished" podID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerID="1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1" exitCode=0 Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.544179 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerDied","Data":"1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.546576 4797 generic.go:334] "Generic (PLEG): container finished" podID="98e37ff7-36ac-4230-b449-83d3d2627535" containerID="572b0d409bac4e836a48417814c176bcfe79d33e90ab89b70eebaba4de59f27f" exitCode=0 Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.546656 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerDied","Data":"572b0d409bac4e836a48417814c176bcfe79d33e90ab89b70eebaba4de59f27f"} Oct 13 13:09:36 crc 
kubenswrapper[4797]: I1013 13:09:36.549343 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8efeeed6d44c33f85a11232997b6f72ede10790fd107371490099088f36d50a8"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.549383 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"597700c85186f50585462c444965d2a183ef1ead5478fb1f485eeaec636b8ccc"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.558094 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"65023d71092c7b135fce0593cf6627c5d41ddbcc56caa378e5718aa314d4910a"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.559932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cc9adce186f65a6836b47193b00615f8e606c43178abded41db960d1f3953f7e"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.560132 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.560778 4797 generic.go:334] "Generic (PLEG): container finished" podID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerID="4950f04c819ac223fc7dd75830eb4f5e560bdb0e234fc9cd584fcbea7dd59886" exitCode=0 Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.560841 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" 
event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerDied","Data":"4950f04c819ac223fc7dd75830eb4f5e560bdb0e234fc9cd584fcbea7dd59886"} Oct 13 13:09:36 crc kubenswrapper[4797]: I1013 13:09:36.665078 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-thw49" podStartSLOduration=2.684838012 podStartE2EDuration="27.665061308s" podCreationTimestamp="2025-10-13 13:09:09 +0000 UTC" firstStartedPulling="2025-10-13 13:09:11.158698348 +0000 UTC m=+128.692248604" lastFinishedPulling="2025-10-13 13:09:36.138921634 +0000 UTC m=+153.672471900" observedRunningTime="2025-10-13 13:09:36.664165605 +0000 UTC m=+154.197715911" watchObservedRunningTime="2025-10-13 13:09:36.665061308 +0000 UTC m=+154.198611564" Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.568880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerStarted","Data":"880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6"} Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.571197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerStarted","Data":"8268a79559e646a34aa16205c0662291659f0d2685e0d0f29ba20b5310eab8c1"} Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.574562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerStarted","Data":"28eb5662e2c52a3d5af63863c9404b9ac2526b626b59cbd62785a5a3ac1562e1"} Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.596967 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gwwpp" podStartSLOduration=3.7668602890000003 
podStartE2EDuration="27.596949561s" podCreationTimestamp="2025-10-13 13:09:10 +0000 UTC" firstStartedPulling="2025-10-13 13:09:13.260514761 +0000 UTC m=+130.794065017" lastFinishedPulling="2025-10-13 13:09:37.090604043 +0000 UTC m=+154.624154289" observedRunningTime="2025-10-13 13:09:37.59450409 +0000 UTC m=+155.128054366" watchObservedRunningTime="2025-10-13 13:09:37.596949561 +0000 UTC m=+155.130499817" Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.618408 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2m52s" podStartSLOduration=2.665647923 podStartE2EDuration="30.618383948s" podCreationTimestamp="2025-10-13 13:09:07 +0000 UTC" firstStartedPulling="2025-10-13 13:09:09.045021853 +0000 UTC m=+126.578572149" lastFinishedPulling="2025-10-13 13:09:36.997757918 +0000 UTC m=+154.531308174" observedRunningTime="2025-10-13 13:09:37.616156102 +0000 UTC m=+155.149706358" watchObservedRunningTime="2025-10-13 13:09:37.618383948 +0000 UTC m=+155.151934224" Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.634798 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxd8n" podStartSLOduration=2.663180401 podStartE2EDuration="30.634777728s" podCreationTimestamp="2025-10-13 13:09:07 +0000 UTC" firstStartedPulling="2025-10-13 13:09:09.04128056 +0000 UTC m=+126.574830816" lastFinishedPulling="2025-10-13 13:09:37.012877877 +0000 UTC m=+154.546428143" observedRunningTime="2025-10-13 13:09:37.632698906 +0000 UTC m=+155.166249172" watchObservedRunningTime="2025-10-13 13:09:37.634777728 +0000 UTC m=+155.168327994" Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.897363 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:37 crc kubenswrapper[4797]: I1013 13:09:37.897490 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:38 crc kubenswrapper[4797]: I1013 13:09:38.271910 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:38 crc kubenswrapper[4797]: I1013 13:09:38.271983 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:39 crc kubenswrapper[4797]: I1013 13:09:39.026559 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2m52s" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="registry-server" probeResult="failure" output=< Oct 13 13:09:39 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 13:09:39 crc kubenswrapper[4797]: > Oct 13 13:09:39 crc kubenswrapper[4797]: I1013 13:09:39.346796 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mxd8n" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="registry-server" probeResult="failure" output=< Oct 13 13:09:39 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 13:09:39 crc kubenswrapper[4797]: > Oct 13 13:09:40 crc kubenswrapper[4797]: I1013 13:09:40.108902 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:40 crc kubenswrapper[4797]: I1013 13:09:40.109244 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:40 crc kubenswrapper[4797]: I1013 13:09:40.169295 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:41 crc kubenswrapper[4797]: I1013 13:09:41.046958 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tmlh5" Oct 13 13:09:41 crc kubenswrapper[4797]: I1013 13:09:41.436274 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:41 crc kubenswrapper[4797]: I1013 13:09:41.436791 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:42 crc kubenswrapper[4797]: I1013 13:09:42.506398 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gwwpp" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="registry-server" probeResult="failure" output=< Oct 13 13:09:42 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 13:09:42 crc kubenswrapper[4797]: > Oct 13 13:09:47 crc kubenswrapper[4797]: I1013 13:09:47.327933 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:09:47 crc kubenswrapper[4797]: I1013 13:09:47.330760 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 13 13:09:47 crc kubenswrapper[4797]: I1013 13:09:47.345936 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e65d35bc-209d-4438-ae53-31deb132aaf5-metrics-certs\") pod \"network-metrics-daemon-pdvg5\" (UID: \"e65d35bc-209d-4438-ae53-31deb132aaf5\") " pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:09:47 crc kubenswrapper[4797]: I1013 13:09:47.599340 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 13 13:09:47 crc kubenswrapper[4797]: I1013 13:09:47.607682 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pdvg5" Oct 13 13:09:47 crc kubenswrapper[4797]: I1013 13:09:47.951783 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.001179 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.041450 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pdvg5"] Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.120777 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.120923 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.332737 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.396064 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.648884 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" event={"ID":"e65d35bc-209d-4438-ae53-31deb132aaf5","Type":"ContainerStarted","Data":"5235d4f0be44c320ad4c01263bc214b8fe2343c7820a91ac7833e926618134f5"} Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.648931 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" event={"ID":"e65d35bc-209d-4438-ae53-31deb132aaf5","Type":"ContainerStarted","Data":"39f951b45df6bae7b14fe9366760a76797462cc09aff88987569cc4ed28b20c2"} Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.648941 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pdvg5" event={"ID":"e65d35bc-209d-4438-ae53-31deb132aaf5","Type":"ContainerStarted","Data":"4493fffcc56042d53f7b250da0a9339b1cf5f72c1a56fb8eb96814894c3732c1"} Oct 13 13:09:48 crc kubenswrapper[4797]: I1013 13:09:48.664588 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pdvg5" podStartSLOduration=145.664567979 podStartE2EDuration="2m25.664567979s" podCreationTimestamp="2025-10-13 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:09:48.66298327 +0000 UTC m=+166.196533536" watchObservedRunningTime="2025-10-13 13:09:48.664567979 +0000 UTC m=+166.198118235" Oct 13 13:09:49 crc kubenswrapper[4797]: I1013 13:09:49.658427 4797 generic.go:334] "Generic (PLEG): container finished" podID="8b97354f-235c-4e1e-9121-13d644be8813" containerID="9b94d92c5126df23297b8ec73d7e81272defc3785500c8bf8d3680b3915e3180" exitCode=0 Oct 13 13:09:49 crc kubenswrapper[4797]: I1013 13:09:49.658610 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfpf9" 
event={"ID":"8b97354f-235c-4e1e-9121-13d644be8813","Type":"ContainerDied","Data":"9b94d92c5126df23297b8ec73d7e81272defc3785500c8bf8d3680b3915e3180"} Oct 13 13:09:49 crc kubenswrapper[4797]: I1013 13:09:49.667264 4797 generic.go:334] "Generic (PLEG): container finished" podID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerID="c482c43fd4f782cf58d8d3492cba10633ba9b148279d0c015c0a2de0a2a31acc" exitCode=0 Oct 13 13:09:49 crc kubenswrapper[4797]: I1013 13:09:49.668419 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4q5g" event={"ID":"51eb3ebb-3fa0-4571-bb5e-ca393071f745","Type":"ContainerDied","Data":"c482c43fd4f782cf58d8d3492cba10633ba9b148279d0c015c0a2de0a2a31acc"} Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.189504 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.680836 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfpf9" event={"ID":"8b97354f-235c-4e1e-9121-13d644be8813","Type":"ContainerStarted","Data":"9b0dc98137c3f8338470382e7bbb17265c71f9e0519ce6b54cd868a9caa81d22"} Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.684518 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4q5g" event={"ID":"51eb3ebb-3fa0-4571-bb5e-ca393071f745","Type":"ContainerStarted","Data":"ab80f50d7a5db2455fbfe85604510463c7a8f8a0e571818aee63744d7ba25bee"} Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.686594 4797 generic.go:334] "Generic (PLEG): container finished" podID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerID="589747dbaa997b888bc19e3ff056761e9ec5ba32180a0670f18078e53033ee18" exitCode=0 Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.686635 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rzpc2" 
event={"ID":"ae42829d-c380-4186-9b4c-55c2221fffd7","Type":"ContainerDied","Data":"589747dbaa997b888bc19e3ff056761e9ec5ba32180a0670f18078e53033ee18"} Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.702692 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tfpf9" podStartSLOduration=2.619372775 podStartE2EDuration="43.70267004s" podCreationTimestamp="2025-10-13 13:09:07 +0000 UTC" firstStartedPulling="2025-10-13 13:09:09.039061875 +0000 UTC m=+126.572612171" lastFinishedPulling="2025-10-13 13:09:50.12235913 +0000 UTC m=+167.655909436" observedRunningTime="2025-10-13 13:09:50.699595693 +0000 UTC m=+168.233145959" watchObservedRunningTime="2025-10-13 13:09:50.70267004 +0000 UTC m=+168.236220306" Oct 13 13:09:50 crc kubenswrapper[4797]: I1013 13:09:50.744823 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v4q5g" podStartSLOduration=2.696205141 podStartE2EDuration="41.744786435s" podCreationTimestamp="2025-10-13 13:09:09 +0000 UTC" firstStartedPulling="2025-10-13 13:09:11.179961794 +0000 UTC m=+128.713512040" lastFinishedPulling="2025-10-13 13:09:50.228543058 +0000 UTC m=+167.762093334" observedRunningTime="2025-10-13 13:09:50.741319168 +0000 UTC m=+168.274869444" watchObservedRunningTime="2025-10-13 13:09:50.744786435 +0000 UTC m=+168.278336701" Oct 13 13:09:51 crc kubenswrapper[4797]: I1013 13:09:51.495998 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:51 crc kubenswrapper[4797]: I1013 13:09:51.543939 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:51 crc kubenswrapper[4797]: I1013 13:09:51.980771 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxd8n"] Oct 13 13:09:51 crc kubenswrapper[4797]: 
I1013 13:09:51.983072 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxd8n" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="registry-server" containerID="cri-o://8268a79559e646a34aa16205c0662291659f0d2685e0d0f29ba20b5310eab8c1" gracePeriod=2 Oct 13 13:09:52 crc kubenswrapper[4797]: I1013 13:09:52.587701 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-thw49"] Oct 13 13:09:52 crc kubenswrapper[4797]: I1013 13:09:52.589133 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-thw49" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="registry-server" containerID="cri-o://dec71000db7c0e70f89283a2d6a8bb0f0120abbb67a30e251bed3f8a1fca3645" gracePeriod=2 Oct 13 13:09:52 crc kubenswrapper[4797]: I1013 13:09:52.703838 4797 generic.go:334] "Generic (PLEG): container finished" podID="98e37ff7-36ac-4230-b449-83d3d2627535" containerID="8268a79559e646a34aa16205c0662291659f0d2685e0d0f29ba20b5310eab8c1" exitCode=0 Oct 13 13:09:52 crc kubenswrapper[4797]: I1013 13:09:52.703905 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerDied","Data":"8268a79559e646a34aa16205c0662291659f0d2685e0d0f29ba20b5310eab8c1"} Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.292260 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.420712 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-utilities\") pod \"98e37ff7-36ac-4230-b449-83d3d2627535\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.420794 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-catalog-content\") pod \"98e37ff7-36ac-4230-b449-83d3d2627535\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.420833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scd2h\" (UniqueName: \"kubernetes.io/projected/98e37ff7-36ac-4230-b449-83d3d2627535-kube-api-access-scd2h\") pod \"98e37ff7-36ac-4230-b449-83d3d2627535\" (UID: \"98e37ff7-36ac-4230-b449-83d3d2627535\") " Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.424651 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-utilities" (OuterVolumeSpecName: "utilities") pod "98e37ff7-36ac-4230-b449-83d3d2627535" (UID: "98e37ff7-36ac-4230-b449-83d3d2627535"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.429087 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e37ff7-36ac-4230-b449-83d3d2627535-kube-api-access-scd2h" (OuterVolumeSpecName: "kube-api-access-scd2h") pod "98e37ff7-36ac-4230-b449-83d3d2627535" (UID: "98e37ff7-36ac-4230-b449-83d3d2627535"). InnerVolumeSpecName "kube-api-access-scd2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.481946 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98e37ff7-36ac-4230-b449-83d3d2627535" (UID: "98e37ff7-36ac-4230-b449-83d3d2627535"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.522599 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.522639 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scd2h\" (UniqueName: \"kubernetes.io/projected/98e37ff7-36ac-4230-b449-83d3d2627535-kube-api-access-scd2h\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.522654 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e37ff7-36ac-4230-b449-83d3d2627535-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.710596 4797 generic.go:334] "Generic (PLEG): container finished" podID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerID="dec71000db7c0e70f89283a2d6a8bb0f0120abbb67a30e251bed3f8a1fca3645" exitCode=0 Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.710664 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thw49" event={"ID":"5d86244d-7468-424e-92d6-13aa23a66ad8","Type":"ContainerDied","Data":"dec71000db7c0e70f89283a2d6a8bb0f0120abbb67a30e251bed3f8a1fca3645"} Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.714061 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-mxd8n" event={"ID":"98e37ff7-36ac-4230-b449-83d3d2627535","Type":"ContainerDied","Data":"9a92e0647b53ff914499105bd862e5299327974808b56fa19e56ee32eb365339"} Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.714094 4797 scope.go:117] "RemoveContainer" containerID="8268a79559e646a34aa16205c0662291659f0d2685e0d0f29ba20b5310eab8c1" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.714199 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxd8n" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.744356 4797 scope.go:117] "RemoveContainer" containerID="572b0d409bac4e836a48417814c176bcfe79d33e90ab89b70eebaba4de59f27f" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.781742 4797 scope.go:117] "RemoveContainer" containerID="b171a07754fe27c866b9b0d97d63589e0687bec227ea39276ecb680c18c3dfbd" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.790631 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxd8n"] Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.790860 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.793521 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxd8n"] Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.928031 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvtsv\" (UniqueName: \"kubernetes.io/projected/5d86244d-7468-424e-92d6-13aa23a66ad8-kube-api-access-tvtsv\") pod \"5d86244d-7468-424e-92d6-13aa23a66ad8\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.928120 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-catalog-content\") pod \"5d86244d-7468-424e-92d6-13aa23a66ad8\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.928159 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-utilities\") pod \"5d86244d-7468-424e-92d6-13aa23a66ad8\" (UID: \"5d86244d-7468-424e-92d6-13aa23a66ad8\") " Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.928980 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-utilities" (OuterVolumeSpecName: "utilities") pod "5d86244d-7468-424e-92d6-13aa23a66ad8" (UID: "5d86244d-7468-424e-92d6-13aa23a66ad8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.934936 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d86244d-7468-424e-92d6-13aa23a66ad8-kube-api-access-tvtsv" (OuterVolumeSpecName: "kube-api-access-tvtsv") pod "5d86244d-7468-424e-92d6-13aa23a66ad8" (UID: "5d86244d-7468-424e-92d6-13aa23a66ad8"). InnerVolumeSpecName "kube-api-access-tvtsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:09:53 crc kubenswrapper[4797]: I1013 13:09:53.955257 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d86244d-7468-424e-92d6-13aa23a66ad8" (UID: "5d86244d-7468-424e-92d6-13aa23a66ad8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.029226 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.029263 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d86244d-7468-424e-92d6-13aa23a66ad8-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.029273 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvtsv\" (UniqueName: \"kubernetes.io/projected/5d86244d-7468-424e-92d6-13aa23a66ad8-kube-api-access-tvtsv\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.721507 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rzpc2" 
event={"ID":"ae42829d-c380-4186-9b4c-55c2221fffd7","Type":"ContainerStarted","Data":"5f4932a17de8c0ff27b23532c9ca7b80388ad285ac5642a9e32818f7f63a8907"} Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.723908 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thw49" event={"ID":"5d86244d-7468-424e-92d6-13aa23a66ad8","Type":"ContainerDied","Data":"d49520929c37d7d0c14e511fa909f51cbf1712c705fddc80cd8515dd4a01222c"} Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.723972 4797 scope.go:117] "RemoveContainer" containerID="dec71000db7c0e70f89283a2d6a8bb0f0120abbb67a30e251bed3f8a1fca3645" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.724085 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thw49" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.738488 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerID="582399ff574eddf8dbb9b93816221f5567a54fe125defe896195a4cb57ad5c95" exitCode=0 Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.738560 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7gzd" event={"ID":"3ae58328-3b33-44bc-a168-9d19d64bc09c","Type":"ContainerDied","Data":"582399ff574eddf8dbb9b93816221f5567a54fe125defe896195a4cb57ad5c95"} Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.756144 4797 scope.go:117] "RemoveContainer" containerID="5439ff3e0b94db491aa40fe0e35c91b37947752f0e26d6e3b11fd5913ab224ea" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.758525 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rzpc2" podStartSLOduration=4.421504399 podStartE2EDuration="47.758500513s" podCreationTimestamp="2025-10-13 13:09:07 +0000 UTC" firstStartedPulling="2025-10-13 13:09:09.033690502 +0000 UTC m=+126.567240758" 
lastFinishedPulling="2025-10-13 13:09:52.370686626 +0000 UTC m=+169.904236872" observedRunningTime="2025-10-13 13:09:54.756168604 +0000 UTC m=+172.289718880" watchObservedRunningTime="2025-10-13 13:09:54.758500513 +0000 UTC m=+172.292050779" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.777720 4797 scope.go:117] "RemoveContainer" containerID="d814c16fd2b03bf5b39cf81a363d693f9f7eebaa721f35ae4a8f05ac8557bfe2" Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.797548 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-thw49"] Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.806177 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-thw49"] Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.980547 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwwpp"] Oct 13 13:09:54 crc kubenswrapper[4797]: I1013 13:09:54.980824 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gwwpp" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="registry-server" containerID="cri-o://880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6" gracePeriod=2 Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.242322 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" path="/var/lib/kubelet/pods/5d86244d-7468-424e-92d6-13aa23a66ad8/volumes" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.243244 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" path="/var/lib/kubelet/pods/98e37ff7-36ac-4230-b449-83d3d2627535/volumes" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.331963 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.443189 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlskg\" (UniqueName: \"kubernetes.io/projected/e73c1f4e-419f-4073-966f-dd76a4b93916-kube-api-access-nlskg\") pod \"e73c1f4e-419f-4073-966f-dd76a4b93916\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.443311 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-catalog-content\") pod \"e73c1f4e-419f-4073-966f-dd76a4b93916\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.443382 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-utilities\") pod \"e73c1f4e-419f-4073-966f-dd76a4b93916\" (UID: \"e73c1f4e-419f-4073-966f-dd76a4b93916\") " Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.444238 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-utilities" (OuterVolumeSpecName: "utilities") pod "e73c1f4e-419f-4073-966f-dd76a4b93916" (UID: "e73c1f4e-419f-4073-966f-dd76a4b93916"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.454060 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e73c1f4e-419f-4073-966f-dd76a4b93916-kube-api-access-nlskg" (OuterVolumeSpecName: "kube-api-access-nlskg") pod "e73c1f4e-419f-4073-966f-dd76a4b93916" (UID: "e73c1f4e-419f-4073-966f-dd76a4b93916"). InnerVolumeSpecName "kube-api-access-nlskg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.526401 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e73c1f4e-419f-4073-966f-dd76a4b93916" (UID: "e73c1f4e-419f-4073-966f-dd76a4b93916"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.545119 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.545160 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73c1f4e-419f-4073-966f-dd76a4b93916-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.545194 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlskg\" (UniqueName: \"kubernetes.io/projected/e73c1f4e-419f-4073-966f-dd76a4b93916-kube-api-access-nlskg\") on node \"crc\" DevicePath \"\"" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.757121 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7gzd" event={"ID":"3ae58328-3b33-44bc-a168-9d19d64bc09c","Type":"ContainerStarted","Data":"b80350a81a0f80eb8672a2347440f67f729e8f78486e8fd3368560c9d8116399"} Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.760490 4797 generic.go:334] "Generic (PLEG): container finished" podID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerID="880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6" exitCode=0 Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.760519 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerDied","Data":"880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6"} Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.760561 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwwpp" event={"ID":"e73c1f4e-419f-4073-966f-dd76a4b93916","Type":"ContainerDied","Data":"fa371ac6cce548ca43793d7964efda6f6ed35e2638e3596dee5d3b9c0367f3da"} Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.760583 4797 scope.go:117] "RemoveContainer" containerID="880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.760592 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwwpp" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.778793 4797 scope.go:117] "RemoveContainer" containerID="1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.779816 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n7gzd" podStartSLOduration=2.773738425 podStartE2EDuration="45.779787075s" podCreationTimestamp="2025-10-13 13:09:10 +0000 UTC" firstStartedPulling="2025-10-13 13:09:12.244174845 +0000 UTC m=+129.777725101" lastFinishedPulling="2025-10-13 13:09:55.250223495 +0000 UTC m=+172.783773751" observedRunningTime="2025-10-13 13:09:55.775324853 +0000 UTC m=+173.308875129" watchObservedRunningTime="2025-10-13 13:09:55.779787075 +0000 UTC m=+173.313337331" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.794534 4797 scope.go:117] "RemoveContainer" containerID="0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.810508 4797 scope.go:117] "RemoveContainer" 
containerID="880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6" Oct 13 13:09:55 crc kubenswrapper[4797]: E1013 13:09:55.811013 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6\": container with ID starting with 880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6 not found: ID does not exist" containerID="880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.811045 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6"} err="failed to get container status \"880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6\": rpc error: code = NotFound desc = could not find container \"880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6\": container with ID starting with 880c9c82b22abb49f98777435923f8c1c3a35cbd4bf3abd1fc960d78e75e03c6 not found: ID does not exist" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.811093 4797 scope.go:117] "RemoveContainer" containerID="1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1" Oct 13 13:09:55 crc kubenswrapper[4797]: E1013 13:09:55.811350 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1\": container with ID starting with 1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1 not found: ID does not exist" containerID="1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.811373 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1"} err="failed to get container status \"1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1\": rpc error: code = NotFound desc = could not find container \"1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1\": container with ID starting with 1493d59f1d22e48a4fe61ac8143f927757a4d4bf0745bbb6c3c8d67e45a081a1 not found: ID does not exist" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.811389 4797 scope.go:117] "RemoveContainer" containerID="0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8" Oct 13 13:09:55 crc kubenswrapper[4797]: E1013 13:09:55.811592 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8\": container with ID starting with 0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8 not found: ID does not exist" containerID="0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.811613 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8"} err="failed to get container status \"0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8\": rpc error: code = NotFound desc = could not find container \"0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8\": container with ID starting with 0b04c58e31d574fc5cbcb70c9c72e59080c33f632e3cc2b96be75f135414f5e8 not found: ID does not exist" Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.840742 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwwpp"] Oct 13 13:09:55 crc kubenswrapper[4797]: I1013 13:09:55.843761 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-gwwpp"] Oct 13 13:09:57 crc kubenswrapper[4797]: I1013 13:09:57.244691 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" path="/var/lib/kubelet/pods/e73c1f4e-419f-4073-966f-dd76a4b93916/volumes" Oct 13 13:09:57 crc kubenswrapper[4797]: I1013 13:09:57.661582 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:57 crc kubenswrapper[4797]: I1013 13:09:57.661947 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:57 crc kubenswrapper[4797]: I1013 13:09:57.728861 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:57 crc kubenswrapper[4797]: I1013 13:09:57.825101 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:09:58 crc kubenswrapper[4797]: I1013 13:09:58.066999 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:58 crc kubenswrapper[4797]: I1013 13:09:58.067056 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:58 crc kubenswrapper[4797]: I1013 13:09:58.103889 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:58 crc kubenswrapper[4797]: I1013 13:09:58.835235 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:09:59 crc kubenswrapper[4797]: I1013 13:09:59.670390 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:59 crc kubenswrapper[4797]: I1013 13:09:59.670514 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:59 crc kubenswrapper[4797]: I1013 13:09:59.712674 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:09:59 crc kubenswrapper[4797]: I1013 13:09:59.830929 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:10:00 crc kubenswrapper[4797]: I1013 13:10:00.969071 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:10:00 crc kubenswrapper[4797]: I1013 13:10:00.969146 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.020135 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.378958 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rzpc2"] Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.379157 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rzpc2" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="registry-server" containerID="cri-o://5f4932a17de8c0ff27b23532c9ca7b80388ad285ac5642a9e32818f7f63a8907" gracePeriod=2 Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.795863 4797 generic.go:334] "Generic (PLEG): container finished" podID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerID="5f4932a17de8c0ff27b23532c9ca7b80388ad285ac5642a9e32818f7f63a8907" exitCode=0 Oct 13 
13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.795958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rzpc2" event={"ID":"ae42829d-c380-4186-9b4c-55c2221fffd7","Type":"ContainerDied","Data":"5f4932a17de8c0ff27b23532c9ca7b80388ad285ac5642a9e32818f7f63a8907"} Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.796935 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rzpc2" event={"ID":"ae42829d-c380-4186-9b4c-55c2221fffd7","Type":"ContainerDied","Data":"e200f596b1df7e174fb7168f36b56cbae1e7ad7cfcd60fa753d0a48de8328958"} Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.796981 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e200f596b1df7e174fb7168f36b56cbae1e7ad7cfcd60fa753d0a48de8328958" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.805305 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.839088 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.921760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-catalog-content\") pod \"ae42829d-c380-4186-9b4c-55c2221fffd7\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.921825 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-utilities\") pod \"ae42829d-c380-4186-9b4c-55c2221fffd7\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 
13:10:01.921847 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d74kt\" (UniqueName: \"kubernetes.io/projected/ae42829d-c380-4186-9b4c-55c2221fffd7-kube-api-access-d74kt\") pod \"ae42829d-c380-4186-9b4c-55c2221fffd7\" (UID: \"ae42829d-c380-4186-9b4c-55c2221fffd7\") " Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.922527 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-utilities" (OuterVolumeSpecName: "utilities") pod "ae42829d-c380-4186-9b4c-55c2221fffd7" (UID: "ae42829d-c380-4186-9b4c-55c2221fffd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.929211 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae42829d-c380-4186-9b4c-55c2221fffd7-kube-api-access-d74kt" (OuterVolumeSpecName: "kube-api-access-d74kt") pod "ae42829d-c380-4186-9b4c-55c2221fffd7" (UID: "ae42829d-c380-4186-9b4c-55c2221fffd7"). InnerVolumeSpecName "kube-api-access-d74kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:10:01 crc kubenswrapper[4797]: I1013 13:10:01.966702 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae42829d-c380-4186-9b4c-55c2221fffd7" (UID: "ae42829d-c380-4186-9b4c-55c2221fffd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:10:02 crc kubenswrapper[4797]: I1013 13:10:02.022948 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:02 crc kubenswrapper[4797]: I1013 13:10:02.022985 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae42829d-c380-4186-9b4c-55c2221fffd7-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:02 crc kubenswrapper[4797]: I1013 13:10:02.022995 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d74kt\" (UniqueName: \"kubernetes.io/projected/ae42829d-c380-4186-9b4c-55c2221fffd7-kube-api-access-d74kt\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:02 crc kubenswrapper[4797]: I1013 13:10:02.800547 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rzpc2" Oct 13 13:10:02 crc kubenswrapper[4797]: I1013 13:10:02.827069 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rzpc2"] Oct 13 13:10:02 crc kubenswrapper[4797]: I1013 13:10:02.831111 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rzpc2"] Oct 13 13:10:03 crc kubenswrapper[4797]: I1013 13:10:03.244084 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" path="/var/lib/kubelet/pods/ae42829d-c380-4186-9b4c-55c2221fffd7/volumes" Oct 13 13:10:04 crc kubenswrapper[4797]: I1013 13:10:04.554014 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx7"] Oct 13 13:10:11 crc kubenswrapper[4797]: I1013 13:10:11.987335 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 13:10:18 crc kubenswrapper[4797]: I1013 13:10:18.120239 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:10:18 crc kubenswrapper[4797]: I1013 13:10:18.120678 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:10:29 crc kubenswrapper[4797]: I1013 13:10:29.579611 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" podUID="4f565495-cb16-4443-8018-24e277acac69" containerName="oauth-openshift" containerID="cri-o://623b6fb9f987c0ab7aac9b59fab0d5579d482e7445779757e4d3325f2d1ea42f" gracePeriod=15 Oct 13 13:10:29 crc kubenswrapper[4797]: I1013 13:10:29.966040 4797 generic.go:334] "Generic (PLEG): container finished" podID="4f565495-cb16-4443-8018-24e277acac69" containerID="623b6fb9f987c0ab7aac9b59fab0d5579d482e7445779757e4d3325f2d1ea42f" exitCode=0 Oct 13 13:10:29 crc kubenswrapper[4797]: I1013 13:10:29.966109 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" event={"ID":"4f565495-cb16-4443-8018-24e277acac69","Type":"ContainerDied","Data":"623b6fb9f987c0ab7aac9b59fab0d5579d482e7445779757e4d3325f2d1ea42f"} Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.077093 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.137456 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"] Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138116 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138154 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138187 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138207 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138281 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138302 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138340 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138360 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138422 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138442 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138486 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138505 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138544 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138564 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138602 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138621 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="extract-utilities" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138660 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138680 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138720 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138738 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="extract-content" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138777 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138795 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138872 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138894 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138933 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1ba19c-2cc3-4c5b-babf-f61c63d1f504" containerName="pruner" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.138951 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1ba19c-2cc3-4c5b-babf-f61c63d1f504" containerName="pruner" Oct 13 13:10:30 crc kubenswrapper[4797]: E1013 13:10:30.138985 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f565495-cb16-4443-8018-24e277acac69" containerName="oauth-openshift" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139002 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f565495-cb16-4443-8018-24e277acac69" containerName="oauth-openshift" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139478 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e73c1f4e-419f-4073-966f-dd76a4b93916" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139530 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f565495-cb16-4443-8018-24e277acac69" containerName="oauth-openshift" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139572 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae42829d-c380-4186-9b4c-55c2221fffd7" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139606 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e37ff7-36ac-4230-b449-83d3d2627535" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139640 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1ba19c-2cc3-4c5b-babf-f61c63d1f504" containerName="pruner" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.139674 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d86244d-7468-424e-92d6-13aa23a66ad8" containerName="registry-server" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.141378 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.155248 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"] Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207279 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-service-ca\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207386 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-ocp-branding-template\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207445 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-idp-0-file-data\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207477 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-serving-cert\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207519 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-session\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207566 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-trusted-ca-bundle\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207599 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-audit-policies\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207631 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-cliconfig\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207661 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-router-certs\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-error\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207704 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f565495-cb16-4443-8018-24e277acac69-audit-dir\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207734 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wzvr\" (UniqueName: \"kubernetes.io/projected/4f565495-cb16-4443-8018-24e277acac69-kube-api-access-5wzvr\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207774 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-login\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.207835 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-provider-selection\") pod \"4f565495-cb16-4443-8018-24e277acac69\" (UID: \"4f565495-cb16-4443-8018-24e277acac69\") " Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208011 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-session\") pod 
\"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208042 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208070 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208096 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208121 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " 
pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208150 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208174 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208199 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208228 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq2tc\" (UniqueName: \"kubernetes.io/projected/1f3b0a9f-5c91-4b07-8437-722585c861b6-kube-api-access-jq2tc\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 
13:10:30.208267 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208298 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208324 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.208383 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/1f3b0a9f-5c91-4b07-8437-722585c861b6-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.209496 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.209539 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.209512 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.209641 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f565495-cb16-4443-8018-24e277acac69-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.210324 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.215890 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.216076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f565495-cb16-4443-8018-24e277acac69-kube-api-access-5wzvr" (OuterVolumeSpecName: "kube-api-access-5wzvr") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "kube-api-access-5wzvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.216724 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.217012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.217400 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.218276 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.218527 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.218709 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.218703 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4f565495-cb16-4443-8018-24e277acac69" (UID: "4f565495-cb16-4443-8018-24e277acac69"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310337 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310377 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310427 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310471 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/1f3b0a9f-5c91-4b07-8437-722585c861b6-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310529 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310565 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310628 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 
13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310687 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310725 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310762 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310801 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310866 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq2tc\" 
(UniqueName: \"kubernetes.io/projected/1f3b0a9f-5c91-4b07-8437-722585c861b6-kube-api-access-jq2tc\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310961 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.310986 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311008 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311029 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311050 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311068 4797 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311088 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311106 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311125 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311144 4797 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f565495-cb16-4443-8018-24e277acac69-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311161 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wzvr\" (UniqueName: \"kubernetes.io/projected/4f565495-cb16-4443-8018-24e277acac69-kube-api-access-5wzvr\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311180 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311198 4797 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311219 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f565495-cb16-4443-8018-24e277acac69-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311566 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.311939 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f3b0a9f-5c91-4b07-8437-722585c861b6-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.312995 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.313385 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.315342 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.315661 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.316420 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.317211 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.317875 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.318234 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.319310 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.319408 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.320139 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f3b0a9f-5c91-4b07-8437-722585c861b6-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.330696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq2tc\" (UniqueName: \"kubernetes.io/projected/1f3b0a9f-5c91-4b07-8437-722585c861b6-kube-api-access-jq2tc\") pod \"oauth-openshift-5d4f55d7c5-vlcb5\" (UID: \"1f3b0a9f-5c91-4b07-8437-722585c861b6\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.465555 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.729931 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"]
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.975775 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7" event={"ID":"4f565495-cb16-4443-8018-24e277acac69","Type":"ContainerDied","Data":"b166f5ffe228add9c386a0e7095a55e709d7d62522bf9b1a83f250a1ec2ac2db"}
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.976106 4797 scope.go:117] "RemoveContainer" containerID="623b6fb9f987c0ab7aac9b59fab0d5579d482e7445779757e4d3325f2d1ea42f"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.975909 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx7"
Oct 13 13:10:30 crc kubenswrapper[4797]: I1013 13:10:30.979919 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" event={"ID":"1f3b0a9f-5c91-4b07-8437-722585c861b6","Type":"ContainerStarted","Data":"4b527e43dfd15e5c78b361411e47d907eb05d2f3359c4e0df3908891c417b3b3"}
Oct 13 13:10:31 crc kubenswrapper[4797]: I1013 13:10:31.031485 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx7"]
Oct 13 13:10:31 crc kubenswrapper[4797]: I1013 13:10:31.035166 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx7"]
Oct 13 13:10:31 crc kubenswrapper[4797]: I1013 13:10:31.251504 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f565495-cb16-4443-8018-24e277acac69" path="/var/lib/kubelet/pods/4f565495-cb16-4443-8018-24e277acac69/volumes"
Oct 13 13:10:31 crc kubenswrapper[4797]: I1013 13:10:31.990575 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" event={"ID":"1f3b0a9f-5c91-4b07-8437-722585c861b6","Type":"ContainerStarted","Data":"83f2dd6eb621b5420da86b9004e69fd3c0ce75d4fa652d141a46d7b4984ed426"}
Oct 13 13:10:31 crc kubenswrapper[4797]: I1013 13:10:31.990680 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:32 crc kubenswrapper[4797]: I1013 13:10:32.001355 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5"
Oct 13 13:10:32 crc kubenswrapper[4797]: I1013 13:10:32.025600 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-vlcb5" podStartSLOduration=28.025564966 podStartE2EDuration="28.025564966s" podCreationTimestamp="2025-10-13 13:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:10:32.023751211 +0000 UTC m=+209.557301507" watchObservedRunningTime="2025-10-13 13:10:32.025564966 +0000 UTC m=+209.559115232"
Oct 13 13:10:48 crc kubenswrapper[4797]: I1013 13:10:48.119737 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 13:10:48 crc kubenswrapper[4797]: I1013 13:10:48.120243 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 13:10:48 crc kubenswrapper[4797]: I1013 13:10:48.120290 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs"
Oct 13 13:10:48 crc kubenswrapper[4797]: I1013 13:10:48.120757 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 13 13:10:48 crc kubenswrapper[4797]: I1013 13:10:48.120823 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00" gracePeriod=600
Oct 13 13:10:49 crc kubenswrapper[4797]: I1013 13:10:49.098996 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00" exitCode=0
Oct 13 13:10:49 crc kubenswrapper[4797]: I1013 13:10:49.099126 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00"}
Oct 13 13:10:49 crc kubenswrapper[4797]: I1013 13:10:49.099361 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"d3ee7955330145cda9bf05f82ba784bd3a2439ca3cd35803a4ccfd63040068c9"}
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.887570 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfpf9"]
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.889155 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tfpf9" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="registry-server" containerID="cri-o://9b0dc98137c3f8338470382e7bbb17265c71f9e0519ce6b54cd868a9caa81d22" gracePeriod=30
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.900765 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2m52s"]
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.901321 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2m52s" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="registry-server" containerID="cri-o://28eb5662e2c52a3d5af63863c9404b9ac2526b626b59cbd62785a5a3ac1562e1" gracePeriod=30
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.905285 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxck2"]
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.905535 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerName="marketplace-operator" containerID="cri-o://92daef95090852916fb77bf7c96e6d4f56cb5b2410062f430b59b417e127ca50" gracePeriod=30
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.925564 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4q5g"]
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.925891 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v4q5g" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="registry-server" containerID="cri-o://ab80f50d7a5db2455fbfe85604510463c7a8f8a0e571818aee63744d7ba25bee" gracePeriod=30
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.936203 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n7gzd"]
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.936424 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n7gzd" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="registry-server" containerID="cri-o://b80350a81a0f80eb8672a2347440f67f729e8f78486e8fd3368560c9d8116399" gracePeriod=30
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.943111 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-frmvf"]
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.944069 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:10:59 crc kubenswrapper[4797]: I1013 13:10:59.958663 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-frmvf"]
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.049374 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aea11d03-92b6-4f03-b4bc-61042afa7406-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.049446 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk8wh\" (UniqueName: \"kubernetes.io/projected/aea11d03-92b6-4f03-b4bc-61042afa7406-kube-api-access-kk8wh\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.049528 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aea11d03-92b6-4f03-b4bc-61042afa7406-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.151358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aea11d03-92b6-4f03-b4bc-61042afa7406-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.151414 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk8wh\" (UniqueName: \"kubernetes.io/projected/aea11d03-92b6-4f03-b4bc-61042afa7406-kube-api-access-kk8wh\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.151474 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aea11d03-92b6-4f03-b4bc-61042afa7406-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.152929 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aea11d03-92b6-4f03-b4bc-61042afa7406-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.159625 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aea11d03-92b6-4f03-b4bc-61042afa7406-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.168233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk8wh\" (UniqueName: \"kubernetes.io/projected/aea11d03-92b6-4f03-b4bc-61042afa7406-kube-api-access-kk8wh\") pod \"marketplace-operator-79b997595-frmvf\" (UID: \"aea11d03-92b6-4f03-b4bc-61042afa7406\") " pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.186699 4797 generic.go:334] "Generic (PLEG): container finished" podID="8b97354f-235c-4e1e-9121-13d644be8813" containerID="9b0dc98137c3f8338470382e7bbb17265c71f9e0519ce6b54cd868a9caa81d22" exitCode=0
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.186750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfpf9" event={"ID":"8b97354f-235c-4e1e-9121-13d644be8813","Type":"ContainerDied","Data":"9b0dc98137c3f8338470382e7bbb17265c71f9e0519ce6b54cd868a9caa81d22"}
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.188624 4797 generic.go:334] "Generic (PLEG): container finished" podID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerID="92daef95090852916fb77bf7c96e6d4f56cb5b2410062f430b59b417e127ca50" exitCode=0
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.188694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" event={"ID":"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e","Type":"ContainerDied","Data":"92daef95090852916fb77bf7c96e6d4f56cb5b2410062f430b59b417e127ca50"}
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.191426 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerID="b80350a81a0f80eb8672a2347440f67f729e8f78486e8fd3368560c9d8116399" exitCode=0
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.191470 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7gzd" event={"ID":"3ae58328-3b33-44bc-a168-9d19d64bc09c","Type":"ContainerDied","Data":"b80350a81a0f80eb8672a2347440f67f729e8f78486e8fd3368560c9d8116399"}
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.206657 4797 generic.go:334] "Generic (PLEG): container finished" podID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerID="ab80f50d7a5db2455fbfe85604510463c7a8f8a0e571818aee63744d7ba25bee" exitCode=0
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.206742 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4q5g" event={"ID":"51eb3ebb-3fa0-4571-bb5e-ca393071f745","Type":"ContainerDied","Data":"ab80f50d7a5db2455fbfe85604510463c7a8f8a0e571818aee63744d7ba25bee"}
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.210645 4797 generic.go:334] "Generic (PLEG): container finished" podID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerID="28eb5662e2c52a3d5af63863c9404b9ac2526b626b59cbd62785a5a3ac1562e1" exitCode=0
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.210674 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerDied","Data":"28eb5662e2c52a3d5af63863c9404b9ac2526b626b59cbd62785a5a3ac1562e1"}
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.315086 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.343769 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfpf9"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.404530 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.411044 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7gzd"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.454552 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-catalog-content\") pod \"8b97354f-235c-4e1e-9121-13d644be8813\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.454650 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtxgp\" (UniqueName: \"kubernetes.io/projected/8b97354f-235c-4e1e-9121-13d644be8813-kube-api-access-xtxgp\") pod \"8b97354f-235c-4e1e-9121-13d644be8813\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.454707 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-utilities\") pod \"8b97354f-235c-4e1e-9121-13d644be8813\" (UID: \"8b97354f-235c-4e1e-9121-13d644be8813\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.457662 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-utilities" (OuterVolumeSpecName: "utilities") pod "8b97354f-235c-4e1e-9121-13d644be8813" (UID: "8b97354f-235c-4e1e-9121-13d644be8813"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.457702 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2m52s"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.466110 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b97354f-235c-4e1e-9121-13d644be8813-kube-api-access-xtxgp" (OuterVolumeSpecName: "kube-api-access-xtxgp") pod "8b97354f-235c-4e1e-9121-13d644be8813" (UID: "8b97354f-235c-4e1e-9121-13d644be8813"). InnerVolumeSpecName "kube-api-access-xtxgp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.467848 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4q5g"
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.531262 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b97354f-235c-4e1e-9121-13d644be8813" (UID: "8b97354f-235c-4e1e-9121-13d644be8813"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560353 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-catalog-content\") pod \"3ae58328-3b33-44bc-a168-9d19d64bc09c\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560422 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-utilities\") pod \"3ae58328-3b33-44bc-a168-9d19d64bc09c\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560469 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-catalog-content\") pod \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560491 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-utilities\") pod \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560516 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-utilities\") pod \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560563 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-catalog-content\") pod \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560597 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nx98\" (UniqueName: \"kubernetes.io/projected/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-kube-api-access-4nx98\") pod \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560636 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-224st\" (UniqueName: \"kubernetes.io/projected/51eb3ebb-3fa0-4571-bb5e-ca393071f745-kube-api-access-224st\") pod \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\" (UID: \"51eb3ebb-3fa0-4571-bb5e-ca393071f745\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560656 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-trusted-ca\") pod \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560682 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-772zf\" (UniqueName: \"kubernetes.io/projected/922167a9-88b5-40b2-8dd3-b04e4ba3f035-kube-api-access-772zf\") pod \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\" (UID: \"922167a9-88b5-40b2-8dd3-b04e4ba3f035\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560722 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54tbd\" (UniqueName: \"kubernetes.io/projected/3ae58328-3b33-44bc-a168-9d19d64bc09c-kube-api-access-54tbd\") pod \"3ae58328-3b33-44bc-a168-9d19d64bc09c\" (UID: \"3ae58328-3b33-44bc-a168-9d19d64bc09c\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.560750 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-operator-metrics\") pod \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\" (UID: \"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e\") "
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.561037 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtxgp\" (UniqueName: \"kubernetes.io/projected/8b97354f-235c-4e1e-9121-13d644be8813-kube-api-access-xtxgp\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.561051 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.561086 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b97354f-235c-4e1e-9121-13d644be8813-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.562352 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-utilities" (OuterVolumeSpecName: "utilities") pod "922167a9-88b5-40b2-8dd3-b04e4ba3f035" (UID: "922167a9-88b5-40b2-8dd3-b04e4ba3f035"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.563110 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" (UID: "c229cffd-cd92-47b5-bec4-3f3eb1c6c81e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.563687 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-utilities" (OuterVolumeSpecName: "utilities") pod "3ae58328-3b33-44bc-a168-9d19d64bc09c" (UID: "3ae58328-3b33-44bc-a168-9d19d64bc09c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.565941 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" (UID: "c229cffd-cd92-47b5-bec4-3f3eb1c6c81e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.566072 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-utilities" (OuterVolumeSpecName: "utilities") pod "51eb3ebb-3fa0-4571-bb5e-ca393071f745" (UID: "51eb3ebb-3fa0-4571-bb5e-ca393071f745"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.566638 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922167a9-88b5-40b2-8dd3-b04e4ba3f035-kube-api-access-772zf" (OuterVolumeSpecName: "kube-api-access-772zf") pod "922167a9-88b5-40b2-8dd3-b04e4ba3f035" (UID: "922167a9-88b5-40b2-8dd3-b04e4ba3f035"). InnerVolumeSpecName "kube-api-access-772zf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.566661 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-kube-api-access-4nx98" (OuterVolumeSpecName: "kube-api-access-4nx98") pod "c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" (UID: "c229cffd-cd92-47b5-bec4-3f3eb1c6c81e"). InnerVolumeSpecName "kube-api-access-4nx98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.568090 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae58328-3b33-44bc-a168-9d19d64bc09c-kube-api-access-54tbd" (OuterVolumeSpecName: "kube-api-access-54tbd") pod "3ae58328-3b33-44bc-a168-9d19d64bc09c" (UID: "3ae58328-3b33-44bc-a168-9d19d64bc09c"). InnerVolumeSpecName "kube-api-access-54tbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.570068 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51eb3ebb-3fa0-4571-bb5e-ca393071f745-kube-api-access-224st" (OuterVolumeSpecName: "kube-api-access-224st") pod "51eb3ebb-3fa0-4571-bb5e-ca393071f745" (UID: "51eb3ebb-3fa0-4571-bb5e-ca393071f745"). InnerVolumeSpecName "kube-api-access-224st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.576413 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51eb3ebb-3fa0-4571-bb5e-ca393071f745" (UID: "51eb3ebb-3fa0-4571-bb5e-ca393071f745"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.620841 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "922167a9-88b5-40b2-8dd3-b04e4ba3f035" (UID: "922167a9-88b5-40b2-8dd3-b04e4ba3f035"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.654844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ae58328-3b33-44bc-a168-9d19d64bc09c" (UID: "3ae58328-3b33-44bc-a168-9d19d64bc09c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662235 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-772zf\" (UniqueName: \"kubernetes.io/projected/922167a9-88b5-40b2-8dd3-b04e4ba3f035-kube-api-access-772zf\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662273 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54tbd\" (UniqueName: \"kubernetes.io/projected/3ae58328-3b33-44bc-a168-9d19d64bc09c-kube-api-access-54tbd\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662284 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662294 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662304 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae58328-3b33-44bc-a168-9d19d64bc09c-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662314 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662336 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922167a9-88b5-40b2-8dd3-b04e4ba3f035-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662345 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662354 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51eb3ebb-3fa0-4571-bb5e-ca393071f745-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662364 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nx98\" (UniqueName: \"kubernetes.io/projected/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-kube-api-access-4nx98\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662373 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.662381 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-224st\" (UniqueName: \"kubernetes.io/projected/51eb3ebb-3fa0-4571-bb5e-ca393071f745-kube-api-access-224st\") on node \"crc\" DevicePath \"\""
Oct 13 13:11:00 crc kubenswrapper[4797]: I1013 13:11:00.774983 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-frmvf"]
Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.222495 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.222520 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hxck2" event={"ID":"c229cffd-cd92-47b5-bec4-3f3eb1c6c81e","Type":"ContainerDied","Data":"eae8462120f3f150e29fd415d4be57658b1ca08ee6355835566fa41c17ae84d0"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.223053 4797 scope.go:117] "RemoveContainer" containerID="92daef95090852916fb77bf7c96e6d4f56cb5b2410062f430b59b417e127ca50" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.226976 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7gzd" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.227035 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7gzd" event={"ID":"3ae58328-3b33-44bc-a168-9d19d64bc09c","Type":"ContainerDied","Data":"41c546e00a0557c5ab95e26123ba5397ed6de4aa808f0c6bf097032cb961e5e0"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.232962 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4q5g" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.232961 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4q5g" event={"ID":"51eb3ebb-3fa0-4571-bb5e-ca393071f745","Type":"ContainerDied","Data":"b120c58051a132baad0fc5a0d1065da2ad3104aeec46a2a3a3c6aa1ad844639f"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.234886 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf" event={"ID":"aea11d03-92b6-4f03-b4bc-61042afa7406","Type":"ContainerStarted","Data":"589aabf2b4a50e00d8ba3db6f65dbb0914cc107c0c6078a18409f8042483603f"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.234918 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf" event={"ID":"aea11d03-92b6-4f03-b4bc-61042afa7406","Type":"ContainerStarted","Data":"61b12788795858100eb48eca25b67fa3aaa6ebc4867c43d85410dab8c7254bf9"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.237439 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-frmvf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.237507 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf" podUID="aea11d03-92b6-4f03-b4bc-61042afa7406" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.237844 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2m52s" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.241593 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfpf9" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.245012 4797 scope.go:117] "RemoveContainer" containerID="b80350a81a0f80eb8672a2347440f67f729e8f78486e8fd3368560c9d8116399" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.265964 4797 scope.go:117] "RemoveContainer" containerID="582399ff574eddf8dbb9b93816221f5567a54fe125defe896195a4cb57ad5c95" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.284688 4797 scope.go:117] "RemoveContainer" containerID="651ad3fffa71e7863684a923d61c94e9a584c4780104b60cdd4b8bf30321ef9e" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.289596 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m52s" event={"ID":"922167a9-88b5-40b2-8dd3-b04e4ba3f035","Type":"ContainerDied","Data":"bccb7f864b5c89d28ed5610a395821dba41c2177ac9ec1398438ba2296818b14"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.290052 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.290162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfpf9" event={"ID":"8b97354f-235c-4e1e-9121-13d644be8813","Type":"ContainerDied","Data":"e4846dea3e620fa577e8984c87dcb74b8db637d1996a2c8257fa7f74d0707144"} Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.298240 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf" podStartSLOduration=2.298206985 podStartE2EDuration="2.298206985s" podCreationTimestamp="2025-10-13 13:10:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:11:01.293269623 +0000 UTC m=+238.826819889" watchObservedRunningTime="2025-10-13 13:11:01.298206985 +0000 UTC m=+238.831757251" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.307672 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxck2"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.312314 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxck2"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.322966 4797 scope.go:117] "RemoveContainer" containerID="ab80f50d7a5db2455fbfe85604510463c7a8f8a0e571818aee63744d7ba25bee" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.349432 4797 scope.go:117] "RemoveContainer" containerID="c482c43fd4f782cf58d8d3492cba10633ba9b148279d0c015c0a2de0a2a31acc" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.357693 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfpf9"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.369283 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tfpf9"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.375762 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n7gzd"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.376974 4797 scope.go:117] "RemoveContainer" containerID="9d9f2ec3a9a051e9c56b4e148ebc256b0a20d16677899f3c219d698ee1a00619" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.382058 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n7gzd"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.387651 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2m52s"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.390338 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2m52s"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.398241 4797 scope.go:117] "RemoveContainer" containerID="28eb5662e2c52a3d5af63863c9404b9ac2526b626b59cbd62785a5a3ac1562e1" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.405419 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4q5g"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.408108 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4q5g"] Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.413970 4797 scope.go:117] "RemoveContainer" containerID="4950f04c819ac223fc7dd75830eb4f5e560bdb0e234fc9cd584fcbea7dd59886" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.435785 4797 scope.go:117] "RemoveContainer" containerID="9d735cd13ac34550de620c04613fc46515d29ed9f19210f858844b597e98e435" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.448509 4797 scope.go:117] "RemoveContainer" containerID="9b0dc98137c3f8338470382e7bbb17265c71f9e0519ce6b54cd868a9caa81d22" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.468337 4797 scope.go:117] "RemoveContainer" containerID="9b94d92c5126df23297b8ec73d7e81272defc3785500c8bf8d3680b3915e3180" Oct 13 13:11:01 crc kubenswrapper[4797]: I1013 13:11:01.487187 4797 scope.go:117] "RemoveContainer" containerID="2bfad293734fa09c3b0552fe4361032662c4545ed4ff057a5201a40d9fdb0a75" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122138 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c9wvg"] Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122434 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" 
containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122450 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122465 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122474 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122485 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122493 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122504 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122513 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122525 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerName="marketplace-operator" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122532 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerName="marketplace-operator" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122544 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122552 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122565 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122573 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122585 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122592 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122601 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122609 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122621 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122629 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122640 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122648 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122658 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122665 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="extract-content" Oct 13 13:11:02 crc kubenswrapper[4797]: E1013 13:11:02.122678 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122685 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="extract-utilities" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122797 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122827 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122837 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122852 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" containerName="marketplace-operator" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.122865 4797 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8b97354f-235c-4e1e-9121-13d644be8813" containerName="registry-server" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.123900 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.127055 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.135870 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9wvg"] Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.258786 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-frmvf" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.286373 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb509468-0e31-4f31-8aea-3a1c9111574d-utilities\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.286758 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb509468-0e31-4f31-8aea-3a1c9111574d-catalog-content\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.286959 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnm2w\" (UniqueName: \"kubernetes.io/projected/cb509468-0e31-4f31-8aea-3a1c9111574d-kube-api-access-mnm2w\") pod \"redhat-marketplace-c9wvg\" (UID: 
\"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.312095 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vgrr4"] Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.313022 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.315040 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.325423 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgrr4"] Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.390864 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb509468-0e31-4f31-8aea-3a1c9111574d-utilities\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.390966 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb509468-0e31-4f31-8aea-3a1c9111574d-catalog-content\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.391010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnm2w\" (UniqueName: \"kubernetes.io/projected/cb509468-0e31-4f31-8aea-3a1c9111574d-kube-api-access-mnm2w\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" 
Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.391970 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb509468-0e31-4f31-8aea-3a1c9111574d-utilities\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.392061 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb509468-0e31-4f31-8aea-3a1c9111574d-catalog-content\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.410666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnm2w\" (UniqueName: \"kubernetes.io/projected/cb509468-0e31-4f31-8aea-3a1c9111574d-kube-api-access-mnm2w\") pod \"redhat-marketplace-c9wvg\" (UID: \"cb509468-0e31-4f31-8aea-3a1c9111574d\") " pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.450238 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.492850 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-utilities\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.492928 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-catalog-content\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.493030 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nz5v\" (UniqueName: \"kubernetes.io/projected/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-kube-api-access-8nz5v\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.594881 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-catalog-content\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.594952 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nz5v\" (UniqueName: \"kubernetes.io/projected/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-kube-api-access-8nz5v\") pod 
\"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.594990 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-utilities\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.595622 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-catalog-content\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.595647 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-utilities\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.616699 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nz5v\" (UniqueName: \"kubernetes.io/projected/eb0fc30c-b4bd-41bc-871e-1e85b3f115f2-kube-api-access-8nz5v\") pod \"redhat-operators-vgrr4\" (UID: \"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2\") " pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.629837 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.802313 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgrr4"] Oct 13 13:11:02 crc kubenswrapper[4797]: W1013 13:11:02.812375 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb0fc30c_b4bd_41bc_871e_1e85b3f115f2.slice/crio-79ff0280c8e5ef255aa5fc4397f579f69dbd5610879eb28bdd3615065371b0b2 WatchSource:0}: Error finding container 79ff0280c8e5ef255aa5fc4397f579f69dbd5610879eb28bdd3615065371b0b2: Status 404 returned error can't find the container with id 79ff0280c8e5ef255aa5fc4397f579f69dbd5610879eb28bdd3615065371b0b2 Oct 13 13:11:02 crc kubenswrapper[4797]: I1013 13:11:02.905913 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9wvg"] Oct 13 13:11:02 crc kubenswrapper[4797]: W1013 13:11:02.911662 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb509468_0e31_4f31_8aea_3a1c9111574d.slice/crio-4c1bc2e368504a43002b9e1633909bd68f778d6714113229def4a58d5a3ef94a WatchSource:0}: Error finding container 4c1bc2e368504a43002b9e1633909bd68f778d6714113229def4a58d5a3ef94a: Status 404 returned error can't find the container with id 4c1bc2e368504a43002b9e1633909bd68f778d6714113229def4a58d5a3ef94a Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.244304 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae58328-3b33-44bc-a168-9d19d64bc09c" path="/var/lib/kubelet/pods/3ae58328-3b33-44bc-a168-9d19d64bc09c/volumes" Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.245072 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51eb3ebb-3fa0-4571-bb5e-ca393071f745" path="/var/lib/kubelet/pods/51eb3ebb-3fa0-4571-bb5e-ca393071f745/volumes" Oct 
13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.245700 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b97354f-235c-4e1e-9121-13d644be8813" path="/var/lib/kubelet/pods/8b97354f-235c-4e1e-9121-13d644be8813/volumes" Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.246766 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922167a9-88b5-40b2-8dd3-b04e4ba3f035" path="/var/lib/kubelet/pods/922167a9-88b5-40b2-8dd3-b04e4ba3f035/volumes" Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.247396 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c229cffd-cd92-47b5-bec4-3f3eb1c6c81e" path="/var/lib/kubelet/pods/c229cffd-cd92-47b5-bec4-3f3eb1c6c81e/volumes" Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.264321 4797 generic.go:334] "Generic (PLEG): container finished" podID="eb0fc30c-b4bd-41bc-871e-1e85b3f115f2" containerID="c5b53846e9867ab58e53ba4de6d81726042668ca1b0b980ed4760659abc54490" exitCode=0 Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.264391 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrr4" event={"ID":"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2","Type":"ContainerDied","Data":"c5b53846e9867ab58e53ba4de6d81726042668ca1b0b980ed4760659abc54490"} Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.264411 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrr4" event={"ID":"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2","Type":"ContainerStarted","Data":"79ff0280c8e5ef255aa5fc4397f579f69dbd5610879eb28bdd3615065371b0b2"} Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.269067 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb509468-0e31-4f31-8aea-3a1c9111574d" containerID="cf24d24e4b4694bd20379bd03d7b2b6d1f6b7a83b2b132a764a502935adaf913" exitCode=0 Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.269848 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-c9wvg" event={"ID":"cb509468-0e31-4f31-8aea-3a1c9111574d","Type":"ContainerDied","Data":"cf24d24e4b4694bd20379bd03d7b2b6d1f6b7a83b2b132a764a502935adaf913"} Oct 13 13:11:03 crc kubenswrapper[4797]: I1013 13:11:03.269874 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9wvg" event={"ID":"cb509468-0e31-4f31-8aea-3a1c9111574d","Type":"ContainerStarted","Data":"4c1bc2e368504a43002b9e1633909bd68f778d6714113229def4a58d5a3ef94a"} Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.511114 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4wk2g"] Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.513621 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.522499 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.523157 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wk2g"] Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.628624 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-utilities\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.628696 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-catalog-content\") pod \"community-operators-4wk2g\" (UID: 
\"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.628723 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlfwl\" (UniqueName: \"kubernetes.io/projected/cefe52ef-b36f-4f16-90f5-dc15e699992e-kube-api-access-mlfwl\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.711933 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x7gp7"] Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.713333 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.715608 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.717566 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7gp7"] Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.730319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-catalog-content\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.730368 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlfwl\" (UniqueName: \"kubernetes.io/projected/cefe52ef-b36f-4f16-90f5-dc15e699992e-kube-api-access-mlfwl\") pod \"community-operators-4wk2g\" (UID: 
\"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.730415 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-utilities\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.730823 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-utilities\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.731029 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-catalog-content\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.758212 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlfwl\" (UniqueName: \"kubernetes.io/projected/cefe52ef-b36f-4f16-90f5-dc15e699992e-kube-api-access-mlfwl\") pod \"community-operators-4wk2g\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.831736 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-catalog-content\") pod \"certified-operators-x7gp7\" (UID: 
\"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.831786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqhg5\" (UniqueName: \"kubernetes.io/projected/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-kube-api-access-fqhg5\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.831834 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-utilities\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.831960 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.939703 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-catalog-content\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.939754 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhg5\" (UniqueName: \"kubernetes.io/projected/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-kube-api-access-fqhg5\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.939824 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-utilities\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.940387 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-catalog-content\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.940459 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-utilities\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " 
pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:04 crc kubenswrapper[4797]: I1013 13:11:04.976946 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhg5\" (UniqueName: \"kubernetes.io/projected/c1a9748e-b3fa-4c05-98f8-5ed245e08fad-kube-api-access-fqhg5\") pod \"certified-operators-x7gp7\" (UID: \"c1a9748e-b3fa-4c05-98f8-5ed245e08fad\") " pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.089568 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:05 crc kubenswrapper[4797]: W1013 13:11:05.259259 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcefe52ef_b36f_4f16_90f5_dc15e699992e.slice/crio-279b724e9dbb2f68260b9fee23a86a073fd815d8b2099dc5aa940950ba9e53b1 WatchSource:0}: Error finding container 279b724e9dbb2f68260b9fee23a86a073fd815d8b2099dc5aa940950ba9e53b1: Status 404 returned error can't find the container with id 279b724e9dbb2f68260b9fee23a86a073fd815d8b2099dc5aa940950ba9e53b1 Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.260828 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wk2g"] Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.286206 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerStarted","Data":"279b724e9dbb2f68260b9fee23a86a073fd815d8b2099dc5aa940950ba9e53b1"} Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.289577 4797 generic.go:334] "Generic (PLEG): container finished" podID="eb0fc30c-b4bd-41bc-871e-1e85b3f115f2" containerID="559b793600ee8c13a058ff7c96bbefcf9c7b1e077babc4eb853da1e5bd61b01b" exitCode=0 Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 
13:11:05.289633 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrr4" event={"ID":"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2","Type":"ContainerDied","Data":"559b793600ee8c13a058ff7c96bbefcf9c7b1e077babc4eb853da1e5bd61b01b"} Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.293741 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb509468-0e31-4f31-8aea-3a1c9111574d" containerID="a92f80a8cd5c5f488bc4480387d33b1891b82139daa59c8cd5da82b2b101f5c1" exitCode=0 Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.293779 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9wvg" event={"ID":"cb509468-0e31-4f31-8aea-3a1c9111574d","Type":"ContainerDied","Data":"a92f80a8cd5c5f488bc4480387d33b1891b82139daa59c8cd5da82b2b101f5c1"} Oct 13 13:11:05 crc kubenswrapper[4797]: I1013 13:11:05.499065 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7gp7"] Oct 13 13:11:05 crc kubenswrapper[4797]: W1013 13:11:05.548730 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a9748e_b3fa_4c05_98f8_5ed245e08fad.slice/crio-8aa1d259c44c099f8729aea51981174f64afbc6d843be1a501ed8ae155efc763 WatchSource:0}: Error finding container 8aa1d259c44c099f8729aea51981174f64afbc6d843be1a501ed8ae155efc763: Status 404 returned error can't find the container with id 8aa1d259c44c099f8729aea51981174f64afbc6d843be1a501ed8ae155efc763 Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.300096 4797 generic.go:334] "Generic (PLEG): container finished" podID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerID="3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25" exitCode=0 Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.300179 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" 
event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerDied","Data":"3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25"} Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.302428 4797 generic.go:334] "Generic (PLEG): container finished" podID="c1a9748e-b3fa-4c05-98f8-5ed245e08fad" containerID="99af5be0471e2570aad3bd68c40c621095f5a366196020f422f31f5f9e025aad" exitCode=0 Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.302516 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7gp7" event={"ID":"c1a9748e-b3fa-4c05-98f8-5ed245e08fad","Type":"ContainerDied","Data":"99af5be0471e2570aad3bd68c40c621095f5a366196020f422f31f5f9e025aad"} Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.302543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7gp7" event={"ID":"c1a9748e-b3fa-4c05-98f8-5ed245e08fad","Type":"ContainerStarted","Data":"8aa1d259c44c099f8729aea51981174f64afbc6d843be1a501ed8ae155efc763"} Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.305051 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgrr4" event={"ID":"eb0fc30c-b4bd-41bc-871e-1e85b3f115f2","Type":"ContainerStarted","Data":"81df54a278f2c7040488c3279528f69a451ca6ca5934991bb67ca85b29d70ca5"} Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.308024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9wvg" event={"ID":"cb509468-0e31-4f31-8aea-3a1c9111574d","Type":"ContainerStarted","Data":"88fbae7debf002f209eac3fe7a6001649344c2c66904951c780371698c4e5b61"} Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.360516 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c9wvg" podStartSLOduration=1.843372395 podStartE2EDuration="4.36049835s" podCreationTimestamp="2025-10-13 13:11:02 +0000 
UTC" firstStartedPulling="2025-10-13 13:11:03.285479302 +0000 UTC m=+240.819029558" lastFinishedPulling="2025-10-13 13:11:05.802605257 +0000 UTC m=+243.336155513" observedRunningTime="2025-10-13 13:11:06.357345851 +0000 UTC m=+243.890896107" watchObservedRunningTime="2025-10-13 13:11:06.36049835 +0000 UTC m=+243.894048606" Oct 13 13:11:06 crc kubenswrapper[4797]: I1013 13:11:06.379884 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vgrr4" podStartSLOduration=1.951464799 podStartE2EDuration="4.379860729s" podCreationTimestamp="2025-10-13 13:11:02 +0000 UTC" firstStartedPulling="2025-10-13 13:11:03.266890052 +0000 UTC m=+240.800440318" lastFinishedPulling="2025-10-13 13:11:05.695285992 +0000 UTC m=+243.228836248" observedRunningTime="2025-10-13 13:11:06.378929975 +0000 UTC m=+243.912480241" watchObservedRunningTime="2025-10-13 13:11:06.379860729 +0000 UTC m=+243.913410995" Oct 13 13:11:09 crc kubenswrapper[4797]: I1013 13:11:09.325703 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7gp7" event={"ID":"c1a9748e-b3fa-4c05-98f8-5ed245e08fad","Type":"ContainerStarted","Data":"6e51bb3e46258fd7ce08e90221899d47e39c5864e01714f5ac88e7e8891c936f"} Oct 13 13:11:09 crc kubenswrapper[4797]: I1013 13:11:09.328727 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerStarted","Data":"48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127"} Oct 13 13:11:10 crc kubenswrapper[4797]: I1013 13:11:10.335901 4797 generic.go:334] "Generic (PLEG): container finished" podID="c1a9748e-b3fa-4c05-98f8-5ed245e08fad" containerID="6e51bb3e46258fd7ce08e90221899d47e39c5864e01714f5ac88e7e8891c936f" exitCode=0 Oct 13 13:11:10 crc kubenswrapper[4797]: I1013 13:11:10.335997 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-x7gp7" event={"ID":"c1a9748e-b3fa-4c05-98f8-5ed245e08fad","Type":"ContainerDied","Data":"6e51bb3e46258fd7ce08e90221899d47e39c5864e01714f5ac88e7e8891c936f"} Oct 13 13:11:10 crc kubenswrapper[4797]: I1013 13:11:10.341200 4797 generic.go:334] "Generic (PLEG): container finished" podID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerID="48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127" exitCode=0 Oct 13 13:11:10 crc kubenswrapper[4797]: I1013 13:11:10.341250 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerDied","Data":"48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127"} Oct 13 13:11:11 crc kubenswrapper[4797]: I1013 13:11:11.360466 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerStarted","Data":"675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa"} Oct 13 13:11:11 crc kubenswrapper[4797]: I1013 13:11:11.363024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7gp7" event={"ID":"c1a9748e-b3fa-4c05-98f8-5ed245e08fad","Type":"ContainerStarted","Data":"4e0b340edc011f5b72708d83fd0fe1a601a8d5fc1a965702261f902081112f10"} Oct 13 13:11:11 crc kubenswrapper[4797]: I1013 13:11:11.384175 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4wk2g" podStartSLOduration=2.963776206 podStartE2EDuration="7.384156948s" podCreationTimestamp="2025-10-13 13:11:04 +0000 UTC" firstStartedPulling="2025-10-13 13:11:06.30271533 +0000 UTC m=+243.836265606" lastFinishedPulling="2025-10-13 13:11:10.723096052 +0000 UTC m=+248.256646348" observedRunningTime="2025-10-13 13:11:11.382259461 +0000 UTC m=+248.915809717" 
watchObservedRunningTime="2025-10-13 13:11:11.384156948 +0000 UTC m=+248.917707204" Oct 13 13:11:11 crc kubenswrapper[4797]: I1013 13:11:11.404610 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x7gp7" podStartSLOduration=2.837355918 podStartE2EDuration="7.404580543s" podCreationTimestamp="2025-10-13 13:11:04 +0000 UTC" firstStartedPulling="2025-10-13 13:11:06.303763856 +0000 UTC m=+243.837314132" lastFinishedPulling="2025-10-13 13:11:10.870988491 +0000 UTC m=+248.404538757" observedRunningTime="2025-10-13 13:11:11.400299347 +0000 UTC m=+248.933849603" watchObservedRunningTime="2025-10-13 13:11:11.404580543 +0000 UTC m=+248.938130799" Oct 13 13:11:12 crc kubenswrapper[4797]: I1013 13:11:12.450439 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:12 crc kubenswrapper[4797]: I1013 13:11:12.450535 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:12 crc kubenswrapper[4797]: I1013 13:11:12.500080 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:12 crc kubenswrapper[4797]: I1013 13:11:12.630698 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:12 crc kubenswrapper[4797]: I1013 13:11:12.631234 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:12 crc kubenswrapper[4797]: I1013 13:11:12.684252 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:13 crc kubenswrapper[4797]: I1013 13:11:13.414489 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-c9wvg" Oct 13 13:11:13 crc kubenswrapper[4797]: I1013 13:11:13.414581 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vgrr4" Oct 13 13:11:14 crc kubenswrapper[4797]: I1013 13:11:14.832929 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:14 crc kubenswrapper[4797]: I1013 13:11:14.832992 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:14 crc kubenswrapper[4797]: I1013 13:11:14.874399 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:15 crc kubenswrapper[4797]: I1013 13:11:15.090457 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:15 crc kubenswrapper[4797]: I1013 13:11:15.090966 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:15 crc kubenswrapper[4797]: I1013 13:11:15.129567 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:11:15 crc kubenswrapper[4797]: I1013 13:11:15.431854 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:11:25 crc kubenswrapper[4797]: I1013 13:11:25.159189 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x7gp7" Oct 13 13:12:48 crc kubenswrapper[4797]: I1013 13:12:48.120926 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:12:48 crc kubenswrapper[4797]: I1013 13:12:48.121617 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:13:18 crc kubenswrapper[4797]: I1013 13:13:18.120388 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:13:18 crc kubenswrapper[4797]: I1013 13:13:18.121041 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:13:21 crc kubenswrapper[4797]: I1013 13:13:21.988000 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mxnr5"] Oct 13 13:13:21 crc kubenswrapper[4797]: I1013 13:13:21.989029 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.004225 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mxnr5"] Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152194 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fddhc\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-kube-api-access-fddhc\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152251 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-registry-tls\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152268 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6511dc90-def6-4ebc-8277-8936109f50e1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152294 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-bound-sa-token\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152332 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6511dc90-def6-4ebc-8277-8936109f50e1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152365 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6511dc90-def6-4ebc-8277-8936109f50e1-trusted-ca\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152418 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.152444 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6511dc90-def6-4ebc-8277-8936109f50e1-registry-certificates\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.176196 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.253749 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6511dc90-def6-4ebc-8277-8936109f50e1-registry-certificates\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.254111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fddhc\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-kube-api-access-fddhc\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.254134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-registry-tls\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.254152 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6511dc90-def6-4ebc-8277-8936109f50e1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 
13:13:22.254173 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-bound-sa-token\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.254210 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6511dc90-def6-4ebc-8277-8936109f50e1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.254243 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6511dc90-def6-4ebc-8277-8936109f50e1-trusted-ca\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.255193 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6511dc90-def6-4ebc-8277-8936109f50e1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.255351 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6511dc90-def6-4ebc-8277-8936109f50e1-trusted-ca\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 
13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.255919 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6511dc90-def6-4ebc-8277-8936109f50e1-registry-certificates\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.263390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6511dc90-def6-4ebc-8277-8936109f50e1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.263562 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-registry-tls\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.271047 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fddhc\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-kube-api-access-fddhc\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.272066 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6511dc90-def6-4ebc-8277-8936109f50e1-bound-sa-token\") pod \"image-registry-66df7c8f76-mxnr5\" (UID: \"6511dc90-def6-4ebc-8277-8936109f50e1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.361416 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:22 crc kubenswrapper[4797]: I1013 13:13:22.595867 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mxnr5"] Oct 13 13:13:23 crc kubenswrapper[4797]: I1013 13:13:23.280450 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" event={"ID":"6511dc90-def6-4ebc-8277-8936109f50e1","Type":"ContainerStarted","Data":"b57fc7b19a91d745e799f733a1ba2782e501a1244c5da4517b822859fafe3438"} Oct 13 13:13:23 crc kubenswrapper[4797]: I1013 13:13:23.280542 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" event={"ID":"6511dc90-def6-4ebc-8277-8936109f50e1","Type":"ContainerStarted","Data":"40d27b3aeac611bd62ba3c9b9ce92e5d203c9b3287a3366bf49632d6a7315ec8"} Oct 13 13:13:23 crc kubenswrapper[4797]: I1013 13:13:23.280709 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:23 crc kubenswrapper[4797]: I1013 13:13:23.312361 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" podStartSLOduration=2.312332746 podStartE2EDuration="2.312332746s" podCreationTimestamp="2025-10-13 13:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:13:23.311352972 +0000 UTC m=+380.844903288" watchObservedRunningTime="2025-10-13 13:13:23.312332746 +0000 UTC m=+380.845883042" Oct 13 13:13:42 crc kubenswrapper[4797]: I1013 13:13:42.367711 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mxnr5" Oct 13 13:13:42 crc kubenswrapper[4797]: I1013 13:13:42.424939 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x9zw9"] Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.119982 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.121333 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.121436 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.126347 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3ee7955330145cda9bf05f82ba784bd3a2439ca3cd35803a4ccfd63040068c9"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.126730 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" 
containerID="cri-o://d3ee7955330145cda9bf05f82ba784bd3a2439ca3cd35803a4ccfd63040068c9" gracePeriod=600 Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.462348 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="d3ee7955330145cda9bf05f82ba784bd3a2439ca3cd35803a4ccfd63040068c9" exitCode=0 Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.462452 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"d3ee7955330145cda9bf05f82ba784bd3a2439ca3cd35803a4ccfd63040068c9"} Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.462731 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"05b59feccb2234abc6b4fd15b059be4b96eb36f86698f4039d57c3c1a3c8d369"} Oct 13 13:13:48 crc kubenswrapper[4797]: I1013 13:13:48.462760 4797 scope.go:117] "RemoveContainer" containerID="ae2106d4b7e73d19b0c8cbd8089d372e56fa08d827a3b45148d0cf68e8596c00" Oct 13 13:14:07 crc kubenswrapper[4797]: I1013 13:14:07.474932 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" podUID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" containerName="registry" containerID="cri-o://bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359" gracePeriod=30 Oct 13 13:14:07 crc kubenswrapper[4797]: I1013 13:14:07.898863 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080361 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-trusted-ca\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080440 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-ca-trust-extracted\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080493 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-certificates\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080575 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-bound-sa-token\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080659 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlx64\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-kube-api-access-nlx64\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080692 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-installation-pull-secrets\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.080902 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.081040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-tls\") pod \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\" (UID: \"fcbb5cc0-3585-4ddb-aa28-1c1097d59318\") " Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.081333 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.082176 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.082665 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.082700 4797 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.089599 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.090262 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-kube-api-access-nlx64" (OuterVolumeSpecName: "kube-api-access-nlx64") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "kube-api-access-nlx64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.090445 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.090650 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.094416 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.116500 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fcbb5cc0-3585-4ddb-aa28-1c1097d59318" (UID: "fcbb5cc0-3585-4ddb-aa28-1c1097d59318"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.183593 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.183650 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlx64\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-kube-api-access-nlx64\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.183677 4797 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.183696 4797 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.183715 4797 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fcbb5cc0-3585-4ddb-aa28-1c1097d59318-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.600854 4797 generic.go:334] "Generic (PLEG): container finished" podID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" containerID="bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359" exitCode=0 Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.600912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" 
event={"ID":"fcbb5cc0-3585-4ddb-aa28-1c1097d59318","Type":"ContainerDied","Data":"bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359"} Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.600946 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.600969 4797 scope.go:117] "RemoveContainer" containerID="bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.600954 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" event={"ID":"fcbb5cc0-3585-4ddb-aa28-1c1097d59318","Type":"ContainerDied","Data":"4e4b50f7512eba4570a3e059afddc8eb940e71032452d8d77612a7f9a13e77a8"} Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.629135 4797 scope.go:117] "RemoveContainer" containerID="bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359" Oct 13 13:14:08 crc kubenswrapper[4797]: E1013 13:14:08.630525 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359\": container with ID starting with bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359 not found: ID does not exist" containerID="bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.630671 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359"} err="failed to get container status \"bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359\": rpc error: code = NotFound desc = could not find container \"bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359\": container with ID 
starting with bd7bdc6e606fe8503731538f4ca59fdcbaadd9ab7a6e3edc9ab1fb1a640ee359 not found: ID does not exist" Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.658981 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x9zw9"] Oct 13 13:14:08 crc kubenswrapper[4797]: I1013 13:14:08.665891 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x9zw9"] Oct 13 13:14:08 crc kubenswrapper[4797]: E1013 13:14:08.749676 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbb5cc0_3585_4ddb_aa28_1c1097d59318.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbb5cc0_3585_4ddb_aa28_1c1097d59318.slice/crio-4e4b50f7512eba4570a3e059afddc8eb940e71032452d8d77612a7f9a13e77a8\": RecentStats: unable to find data in memory cache]" Oct 13 13:14:09 crc kubenswrapper[4797]: I1013 13:14:09.248739 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" path="/var/lib/kubelet/pods/fcbb5cc0-3585-4ddb-aa28-1c1097d59318/volumes" Oct 13 13:14:12 crc kubenswrapper[4797]: I1013 13:14:12.756730 4797 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-x9zw9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.13:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 13 13:14:12 crc kubenswrapper[4797]: I1013 13:14:12.757012 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-x9zw9" podUID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" containerName="registry" probeResult="failure" output="Get 
\"https://10.217.0.13:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.149219 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp"] Oct 13 13:15:00 crc kubenswrapper[4797]: E1013 13:15:00.150281 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" containerName="registry" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.150304 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" containerName="registry" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.150519 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcbb5cc0-3585-4ddb-aa28-1c1097d59318" containerName="registry" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.152886 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.155438 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp"] Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.155859 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.156036 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.309451 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dafde8d-885e-4344-86a8-f384c52b4b56-config-volume\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.309735 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dafde8d-885e-4344-86a8-f384c52b4b56-secret-volume\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.309907 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqkd\" (UniqueName: \"kubernetes.io/projected/2dafde8d-885e-4344-86a8-f384c52b4b56-kube-api-access-bzqkd\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.412114 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dafde8d-885e-4344-86a8-f384c52b4b56-config-volume\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.412224 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dafde8d-885e-4344-86a8-f384c52b4b56-secret-volume\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.412307 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqkd\" (UniqueName: \"kubernetes.io/projected/2dafde8d-885e-4344-86a8-f384c52b4b56-kube-api-access-bzqkd\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.413147 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dafde8d-885e-4344-86a8-f384c52b4b56-config-volume\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.427402 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2dafde8d-885e-4344-86a8-f384c52b4b56-secret-volume\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.436398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqkd\" (UniqueName: \"kubernetes.io/projected/2dafde8d-885e-4344-86a8-f384c52b4b56-kube-api-access-bzqkd\") pod \"collect-profiles-29339355-zk5lp\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.476151 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.754750 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp"] Oct 13 13:15:00 crc kubenswrapper[4797]: I1013 13:15:00.940641 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" event={"ID":"2dafde8d-885e-4344-86a8-f384c52b4b56","Type":"ContainerStarted","Data":"7a806d5e97400be2ffedbd628932000ed9e7800a42578e4f43a02a9b9989a650"} Oct 13 13:15:01 crc kubenswrapper[4797]: I1013 13:15:01.951382 4797 generic.go:334] "Generic (PLEG): container finished" podID="2dafde8d-885e-4344-86a8-f384c52b4b56" containerID="00b045b78ab122f5ca663f60b236c4a5bce0f800121e007c6ce781004fb0e3f4" exitCode=0 Oct 13 13:15:01 crc kubenswrapper[4797]: I1013 13:15:01.951478 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" 
event={"ID":"2dafde8d-885e-4344-86a8-f384c52b4b56","Type":"ContainerDied","Data":"00b045b78ab122f5ca663f60b236c4a5bce0f800121e007c6ce781004fb0e3f4"} Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.267113 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.359582 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzqkd\" (UniqueName: \"kubernetes.io/projected/2dafde8d-885e-4344-86a8-f384c52b4b56-kube-api-access-bzqkd\") pod \"2dafde8d-885e-4344-86a8-f384c52b4b56\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.360159 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dafde8d-885e-4344-86a8-f384c52b4b56-config-volume\") pod \"2dafde8d-885e-4344-86a8-f384c52b4b56\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.360281 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dafde8d-885e-4344-86a8-f384c52b4b56-secret-volume\") pod \"2dafde8d-885e-4344-86a8-f384c52b4b56\" (UID: \"2dafde8d-885e-4344-86a8-f384c52b4b56\") " Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.361374 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dafde8d-885e-4344-86a8-f384c52b4b56-config-volume" (OuterVolumeSpecName: "config-volume") pod "2dafde8d-885e-4344-86a8-f384c52b4b56" (UID: "2dafde8d-885e-4344-86a8-f384c52b4b56"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.366439 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dafde8d-885e-4344-86a8-f384c52b4b56-kube-api-access-bzqkd" (OuterVolumeSpecName: "kube-api-access-bzqkd") pod "2dafde8d-885e-4344-86a8-f384c52b4b56" (UID: "2dafde8d-885e-4344-86a8-f384c52b4b56"). InnerVolumeSpecName "kube-api-access-bzqkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.366927 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dafde8d-885e-4344-86a8-f384c52b4b56-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2dafde8d-885e-4344-86a8-f384c52b4b56" (UID: "2dafde8d-885e-4344-86a8-f384c52b4b56"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.462167 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzqkd\" (UniqueName: \"kubernetes.io/projected/2dafde8d-885e-4344-86a8-f384c52b4b56-kube-api-access-bzqkd\") on node \"crc\" DevicePath \"\"" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.462214 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dafde8d-885e-4344-86a8-f384c52b4b56-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.462228 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dafde8d-885e-4344-86a8-f384c52b4b56-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.966297 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" 
event={"ID":"2dafde8d-885e-4344-86a8-f384c52b4b56","Type":"ContainerDied","Data":"7a806d5e97400be2ffedbd628932000ed9e7800a42578e4f43a02a9b9989a650"} Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.966540 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a806d5e97400be2ffedbd628932000ed9e7800a42578e4f43a02a9b9989a650" Oct 13 13:15:03 crc kubenswrapper[4797]: I1013 13:15:03.966355 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp" Oct 13 13:15:48 crc kubenswrapper[4797]: I1013 13:15:48.120767 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:15:48 crc kubenswrapper[4797]: I1013 13:15:48.121517 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:16:03 crc kubenswrapper[4797]: I1013 13:16:03.527670 4797 scope.go:117] "RemoveContainer" containerID="f48d2ec912c9a8ee0d103cee62910aed5e4df10c4070c6fb8b4e0dfb9000f85d" Oct 13 13:16:03 crc kubenswrapper[4797]: I1013 13:16:03.555299 4797 scope.go:117] "RemoveContainer" containerID="589747dbaa997b888bc19e3ff056761e9ec5ba32180a0670f18078e53033ee18" Oct 13 13:16:03 crc kubenswrapper[4797]: I1013 13:16:03.582364 4797 scope.go:117] "RemoveContainer" containerID="5f4932a17de8c0ff27b23532c9ca7b80388ad285ac5642a9e32818f7f63a8907" Oct 13 13:16:18 crc kubenswrapper[4797]: I1013 13:16:18.120321 4797 patch_prober.go:28] interesting 
pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:16:18 crc kubenswrapper[4797]: I1013 13:16:18.121081 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.120167 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.120891 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.120988 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.121981 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05b59feccb2234abc6b4fd15b059be4b96eb36f86698f4039d57c3c1a3c8d369"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.122180 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://05b59feccb2234abc6b4fd15b059be4b96eb36f86698f4039d57c3c1a3c8d369" gracePeriod=600 Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.662646 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="05b59feccb2234abc6b4fd15b059be4b96eb36f86698f4039d57c3c1a3c8d369" exitCode=0 Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.662770 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"05b59feccb2234abc6b4fd15b059be4b96eb36f86698f4039d57c3c1a3c8d369"} Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.663226 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"61854bbd861c1fc9b67c996c47d52d46e92470dc4bfb3423c7c24026ce57b8ba"} Oct 13 13:16:48 crc kubenswrapper[4797]: I1013 13:16:48.663256 4797 scope.go:117] "RemoveContainer" containerID="d3ee7955330145cda9bf05f82ba784bd3a2439ca3cd35803a4ccfd63040068c9" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.518508 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dhk2q"] Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519521 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="sbdb" 
containerID="cri-o://a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519500 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="nbdb" containerID="cri-o://aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519646 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-acl-logging" containerID="cri-o://32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519628 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519460 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-controller" containerID="cri-o://7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519657 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="northd" containerID="cri-o://1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.519684 
4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-node" containerID="cri-o://6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.572951 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" containerID="cri-o://36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" gracePeriod=30 Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.846208 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/3.log" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.849724 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovn-acl-logging/0.log" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.850525 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovn-controller/0.log" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.851168 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885257 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-bin\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885342 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2wq7\" (UniqueName: \"kubernetes.io/projected/658edc6a-9975-4d8b-9551-821edcc32ce1-kube-api-access-z2wq7\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885349 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885387 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-log-socket\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885429 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-ovn-kubernetes\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-openvswitch\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885513 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-slash\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885540 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-etc-openvswitch\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885542 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885564 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-ovn\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885598 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-netd\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885598 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-slash" (OuterVolumeSpecName: "host-slash") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885586 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-log-socket" (OuterVolumeSpecName: "log-socket") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885633 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885617 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-systemd\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885671 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885702 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885734 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885766 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885840 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-script-lib\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885911 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/658edc6a-9975-4d8b-9551-821edcc32ce1-ovn-node-metrics-cert\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.885947 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-node-log\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886007 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-systemd-units\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886035 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-var-lib-openvswitch\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886067 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-kubelet\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886106 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-config\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886135 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-netns\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886176 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-env-overrides\") pod \"658edc6a-9975-4d8b-9551-821edcc32ce1\" (UID: \"658edc6a-9975-4d8b-9551-821edcc32ce1\") " Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886546 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886691 4797 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886745 4797 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-slash\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886768 4797 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886786 4797 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886832 4797 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886852 4797 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886870 4797 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc 
kubenswrapper[4797]: I1013 13:17:40.886887 4797 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886906 4797 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-log-socket\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886923 4797 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.886980 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.887023 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.887099 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.889262 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.889345 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.890341 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.890438 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-node-log" (OuterVolumeSpecName: "node-log") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.895319 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658edc6a-9975-4d8b-9551-821edcc32ce1-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.900568 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658edc6a-9975-4d8b-9551-821edcc32ce1-kube-api-access-z2wq7" (OuterVolumeSpecName: "kube-api-access-z2wq7") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "kube-api-access-z2wq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.904393 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "658edc6a-9975-4d8b-9551-821edcc32ce1" (UID: "658edc6a-9975-4d8b-9551-821edcc32ce1"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.929668 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s7g6f"] Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.929971 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.929988 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930003 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="northd" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930010 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="northd" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930021 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930028 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930038 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930045 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930054 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" 
containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930063 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930071 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930077 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930088 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-node" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930094 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-node" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930105 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-acl-logging" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930113 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-acl-logging" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930123 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="nbdb" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930130 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="nbdb" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930140 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" 
containerName="kube-rbac-proxy-ovn-metrics" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930147 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-ovn-metrics" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930156 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="sbdb" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930163 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="sbdb" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930175 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kubecfg-setup" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930183 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kubecfg-setup" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930191 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dafde8d-885e-4344-86a8-f384c52b4b56" containerName="collect-profiles" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930197 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dafde8d-885e-4344-86a8-f384c52b4b56" containerName="collect-profiles" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930288 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-node" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930305 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930312 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" 
containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930322 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="nbdb" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930331 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930341 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="kube-rbac-proxy-ovn-metrics" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930351 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="northd" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930361 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dafde8d-885e-4344-86a8-f384c52b4b56" containerName="collect-profiles" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930368 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="sbdb" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930377 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930383 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovn-acl-logging" Oct 13 13:17:40 crc kubenswrapper[4797]: E1013 13:17:40.930501 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930508 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930600 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.930774 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerName="ovnkube-controller" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.932317 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.987854 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-node-log\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.987984 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-var-lib-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988052 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-etc-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988069 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovnkube-config\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988089 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-ovn\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988109 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovn-node-metrics-cert\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988161 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovnkube-script-lib\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988232 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-cni-netd\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988253 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-kubelet\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988278 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988299 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-systemd\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988475 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-run-ovn-kubernetes\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-run-netns\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: 
I1013 13:17:40.988641 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxjqn\" (UniqueName: \"kubernetes.io/projected/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-kube-api-access-zxjqn\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988691 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-log-socket\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988749 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-systemd-units\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988800 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-cni-bin\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.988886 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-env-overrides\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc 
kubenswrapper[4797]: I1013 13:17:40.988989 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989031 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-slash\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989126 4797 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989157 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989184 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/658edc6a-9975-4d8b-9551-821edcc32ce1-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989200 4797 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-node-log\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989217 4797 reconciler_common.go:293] 
"Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989318 4797 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989342 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989360 4797 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/658edc6a-9975-4d8b-9551-821edcc32ce1-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989378 4797 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/658edc6a-9975-4d8b-9551-821edcc32ce1-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:40 crc kubenswrapper[4797]: I1013 13:17:40.989398 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2wq7\" (UniqueName: \"kubernetes.io/projected/658edc6a-9975-4d8b-9551-821edcc32ce1-kube-api-access-z2wq7\") on node \"crc\" DevicePath \"\"" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.021865 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gbdx_b2ab9f14-aae8-45ef-880e-a1563e920f87/kube-multus/1.log" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.022343 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gbdx_b2ab9f14-aae8-45ef-880e-a1563e920f87/kube-multus/0.log" Oct 13 13:17:41 crc kubenswrapper[4797]: 
I1013 13:17:41.022398 4797 generic.go:334] "Generic (PLEG): container finished" podID="b2ab9f14-aae8-45ef-880e-a1563e920f87" containerID="fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248" exitCode=2 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.022493 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gbdx" event={"ID":"b2ab9f14-aae8-45ef-880e-a1563e920f87","Type":"ContainerDied","Data":"fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.022642 4797 scope.go:117] "RemoveContainer" containerID="414f6ddbfec431109009fc83e56eeac94db15726b109e707ebd8d3e2403999b7" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.023525 4797 scope.go:117] "RemoveContainer" containerID="fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.023980 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6gbdx_openshift-multus(b2ab9f14-aae8-45ef-880e-a1563e920f87)\"" pod="openshift-multus/multus-6gbdx" podUID="b2ab9f14-aae8-45ef-880e-a1563e920f87" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.025655 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovnkube-controller/3.log" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.030224 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovn-acl-logging/0.log" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.030947 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dhk2q_658edc6a-9975-4d8b-9551-821edcc32ce1/ovn-controller/0.log" Oct 13 13:17:41 crc 
kubenswrapper[4797]: I1013 13:17:41.031541 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" exitCode=0 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031591 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" exitCode=0 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031603 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" exitCode=0 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031612 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" exitCode=0 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031619 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" exitCode=0 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031627 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" exitCode=0 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031635 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" exitCode=143 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031643 4797 generic.go:334] "Generic (PLEG): container finished" podID="658edc6a-9975-4d8b-9551-821edcc32ce1" 
containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" exitCode=143 Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031749 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031765 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031778 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031789 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031812 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} Oct 
13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031827 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031841 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031847 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031856 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031863 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031871 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031877 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031884 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} Oct 13 
13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031890 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031896 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031904 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031914 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031922 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031928 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031935 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031940 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031946 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031951 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031957 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031963 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031969 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031977 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031986 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.031994 4797 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032002 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032008 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032014 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032021 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032027 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032040 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032045 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032051 4797 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032059 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" event={"ID":"658edc6a-9975-4d8b-9551-821edcc32ce1","Type":"ContainerDied","Data":"23f2e2805650d7d1dd19457d52d7fbaa2345c59697754367b63f611235d82b36"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032068 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032076 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032082 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032087 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032094 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032103 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} Oct 13 
13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032110 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032116 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032124 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032131 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.032298 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dhk2q" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.058654 4797 scope.go:117] "RemoveContainer" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.082918 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dhk2q"] Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.084899 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.088193 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dhk2q"] Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.090853 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-cni-bin\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.090915 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-env-overrides\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.090990 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc 
kubenswrapper[4797]: I1013 13:17:41.091035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-slash\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091057 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-node-log\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091108 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-var-lib-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-etc-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091149 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovnkube-config\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091186 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-ovn\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovn-node-metrics-cert\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091225 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovnkube-script-lib\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091284 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-cni-netd\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091304 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-kubelet\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091391 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-systemd\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091449 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-run-ovn-kubernetes\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091504 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-run-netns\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091536 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxjqn\" (UniqueName: \"kubernetes.io/projected/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-kube-api-access-zxjqn\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091557 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-log-socket\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091598 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-systemd-units\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091699 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-systemd-units\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.091765 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-cni-bin\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.092433 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-env-overrides\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093177 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093212 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-slash\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093237 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-node-log\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093385 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-var-lib-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093407 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-etc-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093890 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovnkube-config\") pod 
\"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093929 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-systemd\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.093959 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-run-ovn-kubernetes\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094131 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-run-netns\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094186 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-cni-netd\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094186 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-ovn\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" 
Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094244 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-host-kubelet\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094271 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-log-socket\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094776 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovnkube-script-lib\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.094931 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-run-openvswitch\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.099255 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-ovn-node-metrics-cert\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.107437 4797 scope.go:117] "RemoveContainer" 
containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.113668 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxjqn\" (UniqueName: \"kubernetes.io/projected/ac4b6b8c-c038-4bd8-8876-b49f366ecdfc-kube-api-access-zxjqn\") pod \"ovnkube-node-s7g6f\" (UID: \"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.120752 4797 scope.go:117] "RemoveContainer" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.134474 4797 scope.go:117] "RemoveContainer" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.148367 4797 scope.go:117] "RemoveContainer" containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.165054 4797 scope.go:117] "RemoveContainer" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.179799 4797 scope.go:117] "RemoveContainer" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.195065 4797 scope.go:117] "RemoveContainer" containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.211373 4797 scope.go:117] "RemoveContainer" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.226671 4797 scope.go:117] "RemoveContainer" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.227210 4797 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": container with ID starting with 36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752 not found: ID does not exist" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.227299 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} err="failed to get container status \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": rpc error: code = NotFound desc = could not find container \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": container with ID starting with 36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.227367 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.228090 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": container with ID starting with 4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d not found: ID does not exist" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.228145 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} err="failed to get container status \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": rpc error: code = NotFound desc = could not find 
container \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": container with ID starting with 4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.228180 4797 scope.go:117] "RemoveContainer" containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.228600 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": container with ID starting with a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598 not found: ID does not exist" containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.228635 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} err="failed to get container status \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": rpc error: code = NotFound desc = could not find container \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": container with ID starting with a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.228660 4797 scope.go:117] "RemoveContainer" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.229032 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": container with ID starting with aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a not found: ID does 
not exist" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.229077 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} err="failed to get container status \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": rpc error: code = NotFound desc = could not find container \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": container with ID starting with aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.229108 4797 scope.go:117] "RemoveContainer" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.229478 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": container with ID starting with 1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7 not found: ID does not exist" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.229508 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} err="failed to get container status \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": rpc error: code = NotFound desc = could not find container \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": container with ID starting with 1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.229528 4797 
scope.go:117] "RemoveContainer" containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.229965 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": container with ID starting with 3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823 not found: ID does not exist" containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.230005 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} err="failed to get container status \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": rpc error: code = NotFound desc = could not find container \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": container with ID starting with 3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.230021 4797 scope.go:117] "RemoveContainer" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.230296 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": container with ID starting with 6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb not found: ID does not exist" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.230333 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} err="failed to get container status \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": rpc error: code = NotFound desc = could not find container \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": container with ID starting with 6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.230354 4797 scope.go:117] "RemoveContainer" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.230712 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": container with ID starting with 32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784 not found: ID does not exist" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.230745 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} err="failed to get container status \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": rpc error: code = NotFound desc = could not find container \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": container with ID starting with 32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.230762 4797 scope.go:117] "RemoveContainer" containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.231196 4797 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": container with ID starting with 7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592 not found: ID does not exist" containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.231231 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} err="failed to get container status \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": rpc error: code = NotFound desc = could not find container \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": container with ID starting with 7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.231252 4797 scope.go:117] "RemoveContainer" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" Oct 13 13:17:41 crc kubenswrapper[4797]: E1013 13:17:41.231563 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": container with ID starting with 6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e not found: ID does not exist" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.231596 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} err="failed to get container status \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": rpc error: code = NotFound desc = could not find container 
\"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": container with ID starting with 6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.231613 4797 scope.go:117] "RemoveContainer" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.231897 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} err="failed to get container status \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": rpc error: code = NotFound desc = could not find container \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": container with ID starting with 36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.231917 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.232284 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} err="failed to get container status \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": rpc error: code = NotFound desc = could not find container \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": container with ID starting with 4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.232326 4797 scope.go:117] "RemoveContainer" containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.232682 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} err="failed to get container status \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": rpc error: code = NotFound desc = could not find container \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": container with ID starting with a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.232713 4797 scope.go:117] "RemoveContainer" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233071 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} err="failed to get container status \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": rpc error: code = NotFound desc = could not find container \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": container with ID starting with aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233091 4797 scope.go:117] "RemoveContainer" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233369 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} err="failed to get container status \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": rpc error: code = NotFound desc = could not find container \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": container with ID starting with 
1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233391 4797 scope.go:117] "RemoveContainer" containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233610 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} err="failed to get container status \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": rpc error: code = NotFound desc = could not find container \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": container with ID starting with 3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233631 4797 scope.go:117] "RemoveContainer" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.233999 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} err="failed to get container status \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": rpc error: code = NotFound desc = could not find container \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": container with ID starting with 6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.234019 4797 scope.go:117] "RemoveContainer" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.234343 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} err="failed to get container status \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": rpc error: code = NotFound desc = could not find container \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": container with ID starting with 32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.234372 4797 scope.go:117] "RemoveContainer" containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.234721 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} err="failed to get container status \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": rpc error: code = NotFound desc = could not find container \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": container with ID starting with 7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.234765 4797 scope.go:117] "RemoveContainer" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.235242 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} err="failed to get container status \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": rpc error: code = NotFound desc = could not find container \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": container with ID starting with 6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e not found: ID does not 
exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.235264 4797 scope.go:117] "RemoveContainer" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.235658 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} err="failed to get container status \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": rpc error: code = NotFound desc = could not find container \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": container with ID starting with 36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.235695 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.236156 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} err="failed to get container status \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": rpc error: code = NotFound desc = could not find container \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": container with ID starting with 4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.236201 4797 scope.go:117] "RemoveContainer" containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.236546 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} err="failed to get container status 
\"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": rpc error: code = NotFound desc = could not find container \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": container with ID starting with a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.236569 4797 scope.go:117] "RemoveContainer" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.236897 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} err="failed to get container status \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": rpc error: code = NotFound desc = could not find container \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": container with ID starting with aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.236927 4797 scope.go:117] "RemoveContainer" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.237224 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} err="failed to get container status \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": rpc error: code = NotFound desc = could not find container \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": container with ID starting with 1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.237249 4797 scope.go:117] "RemoveContainer" 
containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.237585 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} err="failed to get container status \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": rpc error: code = NotFound desc = could not find container \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": container with ID starting with 3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.237614 4797 scope.go:117] "RemoveContainer" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.237886 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} err="failed to get container status \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": rpc error: code = NotFound desc = could not find container \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": container with ID starting with 6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.237923 4797 scope.go:117] "RemoveContainer" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.238224 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} err="failed to get container status \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": rpc error: code = NotFound desc = could 
not find container \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": container with ID starting with 32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.238248 4797 scope.go:117] "RemoveContainer" containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.238532 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} err="failed to get container status \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": rpc error: code = NotFound desc = could not find container \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": container with ID starting with 7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.238559 4797 scope.go:117] "RemoveContainer" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.238891 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} err="failed to get container status \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": rpc error: code = NotFound desc = could not find container \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": container with ID starting with 6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.238921 4797 scope.go:117] "RemoveContainer" containerID="36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 
13:17:41.239225 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752"} err="failed to get container status \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": rpc error: code = NotFound desc = could not find container \"36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752\": container with ID starting with 36507d098b9eb0faf4505b62ed2c38e52ab7e05b27bd8b54f2eaafec373bb752 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.239248 4797 scope.go:117] "RemoveContainer" containerID="4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.239524 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d"} err="failed to get container status \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": rpc error: code = NotFound desc = could not find container \"4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d\": container with ID starting with 4df9cf44f891f2c9d341965d5c89c3f3ca2eb1bf12a54a0a673389869f9c878d not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.239552 4797 scope.go:117] "RemoveContainer" containerID="a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.239996 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598"} err="failed to get container status \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": rpc error: code = NotFound desc = could not find container \"a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598\": container with ID starting with 
a900854ab289e65833932548eadd4705ec501737d66773d5b6c283458125b598 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.240022 4797 scope.go:117] "RemoveContainer" containerID="aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.240335 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a"} err="failed to get container status \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": rpc error: code = NotFound desc = could not find container \"aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a\": container with ID starting with aa5161ba66d687daedb3caa1a0e2d7be83859aa3076731f94aebf83cc3348a4a not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.240360 4797 scope.go:117] "RemoveContainer" containerID="1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.240682 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7"} err="failed to get container status \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": rpc error: code = NotFound desc = could not find container \"1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7\": container with ID starting with 1293f7ed35796e22a4be73a35ad07f83fa98d250d21de2d0b96b9090354142b7 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.240708 4797 scope.go:117] "RemoveContainer" containerID="3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.241377 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823"} err="failed to get container status \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": rpc error: code = NotFound desc = could not find container \"3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823\": container with ID starting with 3e599d81d1a996abd4de74afc58a8255a1ae548327401146b6bdf688d7455823 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.241414 4797 scope.go:117] "RemoveContainer" containerID="6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.241708 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb"} err="failed to get container status \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": rpc error: code = NotFound desc = could not find container \"6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb\": container with ID starting with 6815d3509df673d7f5da2c26130c6c4d533e9d2c25c40f82365ef61d63ee71bb not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.241744 4797 scope.go:117] "RemoveContainer" containerID="32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.242019 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784"} err="failed to get container status \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": rpc error: code = NotFound desc = could not find container \"32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784\": container with ID starting with 32991406197be9d38b8d5e8d1a7e95165b1846e9e054efbe87f30aac9f7f8784 not found: ID does not 
exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.242041 4797 scope.go:117] "RemoveContainer" containerID="7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.244012 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658edc6a-9975-4d8b-9551-821edcc32ce1" path="/var/lib/kubelet/pods/658edc6a-9975-4d8b-9551-821edcc32ce1/volumes" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.245711 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.245785 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592"} err="failed to get container status \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": rpc error: code = NotFound desc = could not find container \"7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592\": container with ID starting with 7aebb018a68c2984d9e4e58071c2b623652bfa700acebaf735c35615abf8c592 not found: ID does not exist" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.245830 4797 scope.go:117] "RemoveContainer" containerID="6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e" Oct 13 13:17:41 crc kubenswrapper[4797]: I1013 13:17:41.246245 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e"} err="failed to get container status \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": rpc error: code = NotFound desc = could not find container \"6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e\": container with ID starting with 6339ec20a5a892de024630eaf6febedfdfa4373c24d394fc758267cc7ae2603e not found: ID does not exist" Oct 13 
13:17:42 crc kubenswrapper[4797]: I1013 13:17:42.048382 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gbdx_b2ab9f14-aae8-45ef-880e-a1563e920f87/kube-multus/1.log" Oct 13 13:17:42 crc kubenswrapper[4797]: I1013 13:17:42.053375 4797 generic.go:334] "Generic (PLEG): container finished" podID="ac4b6b8c-c038-4bd8-8876-b49f366ecdfc" containerID="db2aaa83374fdf44ae5e629357d8707daf7b90d82e714e8f69aac8d7a4d99ef8" exitCode=0 Oct 13 13:17:42 crc kubenswrapper[4797]: I1013 13:17:42.053452 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerDied","Data":"db2aaa83374fdf44ae5e629357d8707daf7b90d82e714e8f69aac8d7a4d99ef8"} Oct 13 13:17:42 crc kubenswrapper[4797]: I1013 13:17:42.053510 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"51ccde701e59cbd6275f0bdd1f39da59b7b74bcbd8a1c0247d4b39de65ce00e2"} Oct 13 13:17:43 crc kubenswrapper[4797]: I1013 13:17:43.065018 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"33a49e6273de6747d2ce7667da06a2557518a74e89c9e6dc465d4cb224f4c316"} Oct 13 13:17:43 crc kubenswrapper[4797]: I1013 13:17:43.065712 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"5013a66a5b83dbadd78d39ab1d969232d8c863877e718c1e657017afbc0f8f0b"} Oct 13 13:17:43 crc kubenswrapper[4797]: I1013 13:17:43.065735 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" 
event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"ee46c71773fd87a9ac81d97a1d00907d93909ad26e9f49ad8910816fd2f376f7"} Oct 13 13:17:43 crc kubenswrapper[4797]: I1013 13:17:43.065754 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"0e4d87458199849fae6d5579e30208c216d439325b1d06ef272ee2faa2174748"} Oct 13 13:17:43 crc kubenswrapper[4797]: I1013 13:17:43.065777 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"93def5c31301aad561bc230473497055ca56686286b7136a3c633fcb4be35763"} Oct 13 13:17:43 crc kubenswrapper[4797]: I1013 13:17:43.065854 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"db572999aa0de57576c861bae755675b04336069c7848d8773a24a00ec96208b"} Oct 13 13:17:46 crc kubenswrapper[4797]: I1013 13:17:46.089318 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"93698779ffe3d5eb1a447a49af7c5b976bad0174439fa725cb03e8c335a62591"} Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.108740 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" event={"ID":"ac4b6b8c-c038-4bd8-8876-b49f366ecdfc","Type":"ContainerStarted","Data":"3903c53f9d9136ba764df5578f5efc117a363f34755d8644604bd5e76391781e"} Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.109662 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.109694 
4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.109722 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.146793 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.148021 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:17:48 crc kubenswrapper[4797]: I1013 13:17:48.154544 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" podStartSLOduration=8.154523068 podStartE2EDuration="8.154523068s" podCreationTimestamp="2025-10-13 13:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:17:48.151146657 +0000 UTC m=+645.684697013" watchObservedRunningTime="2025-10-13 13:17:48.154523068 +0000 UTC m=+645.688073354" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.375239 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7qr56"] Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.376521 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.379437 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.380995 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.381835 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.382517 4797 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xnt4c" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.385350 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7qr56"] Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.418379 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/307671bd-dffc-403e-8ad7-034af76c10be-crc-storage\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.418558 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/307671bd-dffc-403e-8ad7-034af76c10be-node-mnt\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.418671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8j9\" (UniqueName: \"kubernetes.io/projected/307671bd-dffc-403e-8ad7-034af76c10be-kube-api-access-cv8j9\") pod \"crc-storage-crc-7qr56\" (UID: 
\"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.520887 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/307671bd-dffc-403e-8ad7-034af76c10be-node-mnt\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.520979 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8j9\" (UniqueName: \"kubernetes.io/projected/307671bd-dffc-403e-8ad7-034af76c10be-kube-api-access-cv8j9\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.521093 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/307671bd-dffc-403e-8ad7-034af76c10be-crc-storage\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.521596 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/307671bd-dffc-403e-8ad7-034af76c10be-node-mnt\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.522474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/307671bd-dffc-403e-8ad7-034af76c10be-crc-storage\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.556268 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8j9\" (UniqueName: \"kubernetes.io/projected/307671bd-dffc-403e-8ad7-034af76c10be-kube-api-access-cv8j9\") pod \"crc-storage-crc-7qr56\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: I1013 13:17:49.713943 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: E1013 13:17:49.753617 4797 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(2cedff38954cbc82d09fe7596105dbbb47feee628d13b023b1e306dac6a389b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 13 13:17:49 crc kubenswrapper[4797]: E1013 13:17:49.753720 4797 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(2cedff38954cbc82d09fe7596105dbbb47feee628d13b023b1e306dac6a389b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: E1013 13:17:49.753754 4797 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(2cedff38954cbc82d09fe7596105dbbb47feee628d13b023b1e306dac6a389b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:49 crc kubenswrapper[4797]: E1013 13:17:49.753902 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-7qr56_crc-storage(307671bd-dffc-403e-8ad7-034af76c10be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-7qr56_crc-storage(307671bd-dffc-403e-8ad7-034af76c10be)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(2cedff38954cbc82d09fe7596105dbbb47feee628d13b023b1e306dac6a389b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-7qr56" podUID="307671bd-dffc-403e-8ad7-034af76c10be" Oct 13 13:17:50 crc kubenswrapper[4797]: I1013 13:17:50.118780 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:50 crc kubenswrapper[4797]: I1013 13:17:50.119602 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:50 crc kubenswrapper[4797]: E1013 13:17:50.158287 4797 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(33fe183e1997aebae4250b7f7ddd136b0ea8fdeaea3b8322db8d931595cb7795): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 13 13:17:50 crc kubenswrapper[4797]: E1013 13:17:50.158349 4797 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(33fe183e1997aebae4250b7f7ddd136b0ea8fdeaea3b8322db8d931595cb7795): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:50 crc kubenswrapper[4797]: E1013 13:17:50.158368 4797 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(33fe183e1997aebae4250b7f7ddd136b0ea8fdeaea3b8322db8d931595cb7795): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:17:50 crc kubenswrapper[4797]: E1013 13:17:50.158403 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-7qr56_crc-storage(307671bd-dffc-403e-8ad7-034af76c10be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-7qr56_crc-storage(307671bd-dffc-403e-8ad7-034af76c10be)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7qr56_crc-storage_307671bd-dffc-403e-8ad7-034af76c10be_0(33fe183e1997aebae4250b7f7ddd136b0ea8fdeaea3b8322db8d931595cb7795): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-7qr56" podUID="307671bd-dffc-403e-8ad7-034af76c10be" Oct 13 13:17:55 crc kubenswrapper[4797]: I1013 13:17:55.236611 4797 scope.go:117] "RemoveContainer" containerID="fa998288bf7354f5914b82c32971cd88e1fe9535016c7d137b79e4cf5c5c7248" Oct 13 13:17:56 crc kubenswrapper[4797]: I1013 13:17:56.154750 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gbdx_b2ab9f14-aae8-45ef-880e-a1563e920f87/kube-multus/1.log" Oct 13 13:17:56 crc kubenswrapper[4797]: I1013 13:17:56.154996 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gbdx" event={"ID":"b2ab9f14-aae8-45ef-880e-a1563e920f87","Type":"ContainerStarted","Data":"4d0d2ef0c5357ae2cc12c11ec99ccc7f549a05eeb2e548aa6ace781c15d7a31f"} Oct 13 13:18:03 crc kubenswrapper[4797]: I1013 13:18:03.235898 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:18:03 crc kubenswrapper[4797]: I1013 13:18:03.240958 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:18:03 crc kubenswrapper[4797]: I1013 13:18:03.487715 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7qr56"] Oct 13 13:18:03 crc kubenswrapper[4797]: I1013 13:18:03.497878 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 13:18:04 crc kubenswrapper[4797]: I1013 13:18:04.208783 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7qr56" event={"ID":"307671bd-dffc-403e-8ad7-034af76c10be","Type":"ContainerStarted","Data":"5df25fb1853006bca93cd44cd7512a1ae95ada6f9fc5ae9a6ec2f6d996144989"} Oct 13 13:18:05 crc kubenswrapper[4797]: I1013 13:18:05.217031 4797 generic.go:334] "Generic (PLEG): container finished" podID="307671bd-dffc-403e-8ad7-034af76c10be" containerID="4cbef5a09a8ea7a1158c859166d8ca251477ff59b88f204ba42490bff7c6d5cc" exitCode=0 Oct 13 13:18:05 crc kubenswrapper[4797]: I1013 13:18:05.217225 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7qr56" event={"ID":"307671bd-dffc-403e-8ad7-034af76c10be","Type":"ContainerDied","Data":"4cbef5a09a8ea7a1158c859166d8ca251477ff59b88f204ba42490bff7c6d5cc"} Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.539919 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.576969 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/307671bd-dffc-403e-8ad7-034af76c10be-crc-storage\") pod \"307671bd-dffc-403e-8ad7-034af76c10be\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.577028 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/307671bd-dffc-403e-8ad7-034af76c10be-node-mnt\") pod \"307671bd-dffc-403e-8ad7-034af76c10be\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.577207 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv8j9\" (UniqueName: \"kubernetes.io/projected/307671bd-dffc-403e-8ad7-034af76c10be-kube-api-access-cv8j9\") pod \"307671bd-dffc-403e-8ad7-034af76c10be\" (UID: \"307671bd-dffc-403e-8ad7-034af76c10be\") " Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.577273 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/307671bd-dffc-403e-8ad7-034af76c10be-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "307671bd-dffc-403e-8ad7-034af76c10be" (UID: "307671bd-dffc-403e-8ad7-034af76c10be"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.577525 4797 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/307671bd-dffc-403e-8ad7-034af76c10be-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.585062 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307671bd-dffc-403e-8ad7-034af76c10be-kube-api-access-cv8j9" (OuterVolumeSpecName: "kube-api-access-cv8j9") pod "307671bd-dffc-403e-8ad7-034af76c10be" (UID: "307671bd-dffc-403e-8ad7-034af76c10be"). InnerVolumeSpecName "kube-api-access-cv8j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.600286 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/307671bd-dffc-403e-8ad7-034af76c10be-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "307671bd-dffc-403e-8ad7-034af76c10be" (UID: "307671bd-dffc-403e-8ad7-034af76c10be"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.678143 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv8j9\" (UniqueName: \"kubernetes.io/projected/307671bd-dffc-403e-8ad7-034af76c10be-kube-api-access-cv8j9\") on node \"crc\" DevicePath \"\"" Oct 13 13:18:06 crc kubenswrapper[4797]: I1013 13:18:06.678176 4797 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/307671bd-dffc-403e-8ad7-034af76c10be-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 13 13:18:07 crc kubenswrapper[4797]: I1013 13:18:07.235599 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7qr56" Oct 13 13:18:07 crc kubenswrapper[4797]: I1013 13:18:07.249254 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7qr56" event={"ID":"307671bd-dffc-403e-8ad7-034af76c10be","Type":"ContainerDied","Data":"5df25fb1853006bca93cd44cd7512a1ae95ada6f9fc5ae9a6ec2f6d996144989"} Oct 13 13:18:07 crc kubenswrapper[4797]: I1013 13:18:07.249324 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5df25fb1853006bca93cd44cd7512a1ae95ada6f9fc5ae9a6ec2f6d996144989" Oct 13 13:18:11 crc kubenswrapper[4797]: I1013 13:18:11.280686 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s7g6f" Oct 13 13:18:14 crc kubenswrapper[4797]: I1013 13:18:14.959714 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz"] Oct 13 13:18:14 crc kubenswrapper[4797]: E1013 13:18:14.960182 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307671bd-dffc-403e-8ad7-034af76c10be" containerName="storage" Oct 13 13:18:14 crc kubenswrapper[4797]: I1013 13:18:14.960197 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="307671bd-dffc-403e-8ad7-034af76c10be" containerName="storage" Oct 13 13:18:14 crc kubenswrapper[4797]: I1013 13:18:14.960293 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="307671bd-dffc-403e-8ad7-034af76c10be" containerName="storage" Oct 13 13:18:14 crc kubenswrapper[4797]: I1013 13:18:14.960952 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:14 crc kubenswrapper[4797]: I1013 13:18:14.962531 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 13:18:14 crc kubenswrapper[4797]: I1013 13:18:14.972702 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz"] Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.101728 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbhmv\" (UniqueName: \"kubernetes.io/projected/e70a0257-aaba-41a5-a201-8062761a1adf-kube-api-access-vbhmv\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.101797 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.101883 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: 
I1013 13:18:15.202512 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbhmv\" (UniqueName: \"kubernetes.io/projected/e70a0257-aaba-41a5-a201-8062761a1adf-kube-api-access-vbhmv\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.202570 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.202614 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.203031 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.203215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.229721 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbhmv\" (UniqueName: \"kubernetes.io/projected/e70a0257-aaba-41a5-a201-8062761a1adf-kube-api-access-vbhmv\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.277881 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:15 crc kubenswrapper[4797]: I1013 13:18:15.440626 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz"] Oct 13 13:18:15 crc kubenswrapper[4797]: W1013 13:18:15.448625 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode70a0257_aaba_41a5_a201_8062761a1adf.slice/crio-bb3c16e5bc7509fac9acbde8f6242d2e2561e4f01818a338abfe557d1a09447a WatchSource:0}: Error finding container bb3c16e5bc7509fac9acbde8f6242d2e2561e4f01818a338abfe557d1a09447a: Status 404 returned error can't find the container with id bb3c16e5bc7509fac9acbde8f6242d2e2561e4f01818a338abfe557d1a09447a Oct 13 13:18:16 crc kubenswrapper[4797]: I1013 13:18:16.302572 4797 generic.go:334] "Generic (PLEG): container finished" podID="e70a0257-aaba-41a5-a201-8062761a1adf" containerID="dceed38242f2deca06b6d9f1dfecce410dd3d20f39d629ab16d344232f3149d6" exitCode=0 
Oct 13 13:18:16 crc kubenswrapper[4797]: I1013 13:18:16.302630 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" event={"ID":"e70a0257-aaba-41a5-a201-8062761a1adf","Type":"ContainerDied","Data":"dceed38242f2deca06b6d9f1dfecce410dd3d20f39d629ab16d344232f3149d6"} Oct 13 13:18:16 crc kubenswrapper[4797]: I1013 13:18:16.302660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" event={"ID":"e70a0257-aaba-41a5-a201-8062761a1adf","Type":"ContainerStarted","Data":"bb3c16e5bc7509fac9acbde8f6242d2e2561e4f01818a338abfe557d1a09447a"} Oct 13 13:18:18 crc kubenswrapper[4797]: I1013 13:18:18.316414 4797 generic.go:334] "Generic (PLEG): container finished" podID="e70a0257-aaba-41a5-a201-8062761a1adf" containerID="3a5578914cc25a792ffa12f8f441012873197eeb6c4aa96cb689ea7d243e5128" exitCode=0 Oct 13 13:18:18 crc kubenswrapper[4797]: I1013 13:18:18.316485 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" event={"ID":"e70a0257-aaba-41a5-a201-8062761a1adf","Type":"ContainerDied","Data":"3a5578914cc25a792ffa12f8f441012873197eeb6c4aa96cb689ea7d243e5128"} Oct 13 13:18:19 crc kubenswrapper[4797]: I1013 13:18:19.327945 4797 generic.go:334] "Generic (PLEG): container finished" podID="e70a0257-aaba-41a5-a201-8062761a1adf" containerID="28445a7e7b324b5cdafbdc988b581a45bf4a8d5752a18a075cf2aa3e530d1481" exitCode=0 Oct 13 13:18:19 crc kubenswrapper[4797]: I1013 13:18:19.328027 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" event={"ID":"e70a0257-aaba-41a5-a201-8062761a1adf","Type":"ContainerDied","Data":"28445a7e7b324b5cdafbdc988b581a45bf4a8d5752a18a075cf2aa3e530d1481"} Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 
13:18:20.621601 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.684849 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-util\") pod \"e70a0257-aaba-41a5-a201-8062761a1adf\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.684936 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbhmv\" (UniqueName: \"kubernetes.io/projected/e70a0257-aaba-41a5-a201-8062761a1adf-kube-api-access-vbhmv\") pod \"e70a0257-aaba-41a5-a201-8062761a1adf\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.684983 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-bundle\") pod \"e70a0257-aaba-41a5-a201-8062761a1adf\" (UID: \"e70a0257-aaba-41a5-a201-8062761a1adf\") " Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.686212 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-bundle" (OuterVolumeSpecName: "bundle") pod "e70a0257-aaba-41a5-a201-8062761a1adf" (UID: "e70a0257-aaba-41a5-a201-8062761a1adf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.693364 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70a0257-aaba-41a5-a201-8062761a1adf-kube-api-access-vbhmv" (OuterVolumeSpecName: "kube-api-access-vbhmv") pod "e70a0257-aaba-41a5-a201-8062761a1adf" (UID: "e70a0257-aaba-41a5-a201-8062761a1adf"). InnerVolumeSpecName "kube-api-access-vbhmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.703918 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-util" (OuterVolumeSpecName: "util") pod "e70a0257-aaba-41a5-a201-8062761a1adf" (UID: "e70a0257-aaba-41a5-a201-8062761a1adf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.787083 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-util\") on node \"crc\" DevicePath \"\"" Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.787130 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbhmv\" (UniqueName: \"kubernetes.io/projected/e70a0257-aaba-41a5-a201-8062761a1adf-kube-api-access-vbhmv\") on node \"crc\" DevicePath \"\"" Oct 13 13:18:20 crc kubenswrapper[4797]: I1013 13:18:20.787153 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70a0257-aaba-41a5-a201-8062761a1adf-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:18:21 crc kubenswrapper[4797]: I1013 13:18:21.341796 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" 
event={"ID":"e70a0257-aaba-41a5-a201-8062761a1adf","Type":"ContainerDied","Data":"bb3c16e5bc7509fac9acbde8f6242d2e2561e4f01818a338abfe557d1a09447a"} Oct 13 13:18:21 crc kubenswrapper[4797]: I1013 13:18:21.341854 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3c16e5bc7509fac9acbde8f6242d2e2561e4f01818a338abfe557d1a09447a" Oct 13 13:18:21 crc kubenswrapper[4797]: I1013 13:18:21.341999 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.736172 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rzsld"] Oct 13 13:18:23 crc kubenswrapper[4797]: E1013 13:18:23.736639 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="pull" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.736654 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="pull" Oct 13 13:18:23 crc kubenswrapper[4797]: E1013 13:18:23.736672 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="extract" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.736680 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="extract" Oct 13 13:18:23 crc kubenswrapper[4797]: E1013 13:18:23.736701 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="util" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.736712 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="util" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.736833 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e70a0257-aaba-41a5-a201-8062761a1adf" containerName="extract" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.737260 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.739748 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-z4x49" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.739821 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.739941 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.751676 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rzsld"] Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.832592 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6z5v\" (UniqueName: \"kubernetes.io/projected/b08560e7-e858-4ae9-9674-ac7c89a2d5d4-kube-api-access-b6z5v\") pod \"nmstate-operator-858ddd8f98-rzsld\" (UID: \"b08560e7-e858-4ae9-9674-ac7c89a2d5d4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.933642 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6z5v\" (UniqueName: \"kubernetes.io/projected/b08560e7-e858-4ae9-9674-ac7c89a2d5d4-kube-api-access-b6z5v\") pod \"nmstate-operator-858ddd8f98-rzsld\" (UID: \"b08560e7-e858-4ae9-9674-ac7c89a2d5d4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" Oct 13 13:18:23 crc kubenswrapper[4797]: I1013 13:18:23.957771 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b6z5v\" (UniqueName: \"kubernetes.io/projected/b08560e7-e858-4ae9-9674-ac7c89a2d5d4-kube-api-access-b6z5v\") pod \"nmstate-operator-858ddd8f98-rzsld\" (UID: \"b08560e7-e858-4ae9-9674-ac7c89a2d5d4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" Oct 13 13:18:24 crc kubenswrapper[4797]: I1013 13:18:24.051963 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" Oct 13 13:18:24 crc kubenswrapper[4797]: I1013 13:18:24.456216 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rzsld"] Oct 13 13:18:25 crc kubenswrapper[4797]: I1013 13:18:25.363271 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" event={"ID":"b08560e7-e858-4ae9-9674-ac7c89a2d5d4","Type":"ContainerStarted","Data":"472c86b564274be582ce328f4180ab5ffdbdec75c1e1731f9aef720e5293eb4b"} Oct 13 13:18:27 crc kubenswrapper[4797]: I1013 13:18:27.378544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" event={"ID":"b08560e7-e858-4ae9-9674-ac7c89a2d5d4","Type":"ContainerStarted","Data":"8309f5f80829b3f432cb48e57433b3a0e7181ee944bb4cbcabc578b73eb7b3ac"} Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.372070 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rzsld" podStartSLOduration=3.257622803 podStartE2EDuration="5.372048833s" podCreationTimestamp="2025-10-13 13:18:23 +0000 UTC" firstStartedPulling="2025-10-13 13:18:24.469037824 +0000 UTC m=+682.002588080" lastFinishedPulling="2025-10-13 13:18:26.583463854 +0000 UTC m=+684.117014110" observedRunningTime="2025-10-13 13:18:27.406228059 +0000 UTC m=+684.939778325" watchObservedRunningTime="2025-10-13 13:18:28.372048833 +0000 UTC m=+685.905599089" Oct 13 13:18:28 crc 
kubenswrapper[4797]: I1013 13:18:28.374536 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.375654 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.378245 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.378373 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.378764 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nrvtn" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.379243 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.391279 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.414035 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.418520 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-4gzdz"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.419308 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.492602 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-ovs-socket\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.492646 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvftj\" (UniqueName: \"kubernetes.io/projected/4b7f9fc5-d10a-4c60-a58c-f3636f159812-kube-api-access-pvftj\") pod \"nmstate-metrics-fdff9cb8d-dgck6\" (UID: \"4b7f9fc5-d10a-4c60-a58c-f3636f159812\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.492689 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-dbus-socket\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.492714 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-nmstate-lock\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.492894 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5smx\" (UniqueName: \"kubernetes.io/projected/567b9831-64e4-48ec-bdae-acc961720179-kube-api-access-z5smx\") pod 
\"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.492955 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frg6k\" (UniqueName: \"kubernetes.io/projected/2b289246-79ef-45ae-afdd-ab40537151b5-kube-api-access-frg6k\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.493078 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2b289246-79ef-45ae-afdd-ab40537151b5-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.498841 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.499866 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.501454 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.501830 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.502895 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kv7sk" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.509733 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594726 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-dbus-socket\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594763 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-nmstate-lock\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594799 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frg6k\" (UniqueName: \"kubernetes.io/projected/2b289246-79ef-45ae-afdd-ab40537151b5-kube-api-access-frg6k\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:28 crc 
kubenswrapper[4797]: I1013 13:18:28.594832 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5smx\" (UniqueName: \"kubernetes.io/projected/567b9831-64e4-48ec-bdae-acc961720179-kube-api-access-z5smx\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594866 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6e43341-62a8-462b-80c3-86ab7db3a7f6-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594889 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2b289246-79ef-45ae-afdd-ab40537151b5-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594912 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-ovs-socket\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594932 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6e43341-62a8-462b-80c3-86ab7db3a7f6-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594947 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvftj\" (UniqueName: \"kubernetes.io/projected/4b7f9fc5-d10a-4c60-a58c-f3636f159812-kube-api-access-pvftj\") pod \"nmstate-metrics-fdff9cb8d-dgck6\" (UID: \"4b7f9fc5-d10a-4c60-a58c-f3636f159812\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.594971 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbdc\" (UniqueName: \"kubernetes.io/projected/b6e43341-62a8-462b-80c3-86ab7db3a7f6-kube-api-access-ncbdc\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.595044 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-nmstate-lock\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.595069 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-dbus-socket\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: E1013 13:18:28.595199 4797 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 13 13:18:28 crc kubenswrapper[4797]: E1013 13:18:28.595261 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2b289246-79ef-45ae-afdd-ab40537151b5-tls-key-pair podName:2b289246-79ef-45ae-afdd-ab40537151b5 nodeName:}" failed. No retries permitted until 2025-10-13 13:18:29.095242773 +0000 UTC m=+686.628793029 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2b289246-79ef-45ae-afdd-ab40537151b5-tls-key-pair") pod "nmstate-webhook-6cdbc54649-6g2mr" (UID: "2b289246-79ef-45ae-afdd-ab40537151b5") : secret "openshift-nmstate-webhook" not found Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.595343 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/567b9831-64e4-48ec-bdae-acc961720179-ovs-socket\") pod \"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.616615 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvftj\" (UniqueName: \"kubernetes.io/projected/4b7f9fc5-d10a-4c60-a58c-f3636f159812-kube-api-access-pvftj\") pod \"nmstate-metrics-fdff9cb8d-dgck6\" (UID: \"4b7f9fc5-d10a-4c60-a58c-f3636f159812\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.618642 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frg6k\" (UniqueName: \"kubernetes.io/projected/2b289246-79ef-45ae-afdd-ab40537151b5-kube-api-access-frg6k\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.625895 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5smx\" (UniqueName: \"kubernetes.io/projected/567b9831-64e4-48ec-bdae-acc961720179-kube-api-access-z5smx\") pod 
\"nmstate-handler-4gzdz\" (UID: \"567b9831-64e4-48ec-bdae-acc961720179\") " pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.688430 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7675bc8ff6-52qlb"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.689595 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.696472 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6e43341-62a8-462b-80c3-86ab7db3a7f6-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.696517 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbdc\" (UniqueName: \"kubernetes.io/projected/b6e43341-62a8-462b-80c3-86ab7db3a7f6-kube-api-access-ncbdc\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.696566 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6e43341-62a8-462b-80c3-86ab7db3a7f6-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.697619 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6e43341-62a8-462b-80c3-86ab7db3a7f6-nginx-conf\") pod 
\"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.699993 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.702191 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7675bc8ff6-52qlb"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.705346 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6e43341-62a8-462b-80c3-86ab7db3a7f6-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.724509 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbdc\" (UniqueName: \"kubernetes.io/projected/b6e43341-62a8-462b-80c3-86ab7db3a7f6-kube-api-access-ncbdc\") pod \"nmstate-console-plugin-6b874cbd85-vcz62\" (UID: \"b6e43341-62a8-462b-80c3-86ab7db3a7f6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.739233 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798052 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-oauth-config\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798103 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-serving-cert\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798125 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-config\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798145 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzsd8\" (UniqueName: \"kubernetes.io/projected/2a138b4e-a564-465d-b9cc-592fd40b80bc-kube-api-access-jzsd8\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798168 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-trusted-ca-bundle\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798183 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-oauth-serving-cert\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.798214 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-service-ca\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.816917 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.898785 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6"] Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899411 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-service-ca\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899492 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-oauth-config\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899528 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-serving-cert\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899547 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-config\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899572 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzsd8\" 
(UniqueName: \"kubernetes.io/projected/2a138b4e-a564-465d-b9cc-592fd40b80bc-kube-api-access-jzsd8\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899592 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-trusted-ca-bundle\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.899607 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-oauth-serving-cert\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.900390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-oauth-serving-cert\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.900823 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-trusted-ca-bundle\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.901095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-service-ca\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.901199 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-config\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.903271 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-serving-cert\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.904606 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a138b4e-a564-465d-b9cc-592fd40b80bc-console-oauth-config\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:28 crc kubenswrapper[4797]: I1013 13:18:28.917339 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzsd8\" (UniqueName: \"kubernetes.io/projected/2a138b4e-a564-465d-b9cc-592fd40b80bc-kube-api-access-jzsd8\") pod \"console-7675bc8ff6-52qlb\" (UID: \"2a138b4e-a564-465d-b9cc-592fd40b80bc\") " pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.023937 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.101496 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2b289246-79ef-45ae-afdd-ab40537151b5-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.104980 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2b289246-79ef-45ae-afdd-ab40537151b5-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6g2mr\" (UID: \"2b289246-79ef-45ae-afdd-ab40537151b5\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.212270 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62"] Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.215197 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7675bc8ff6-52qlb"] Oct 13 13:18:29 crc kubenswrapper[4797]: W1013 13:18:29.215687 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e43341_62a8_462b_80c3_86ab7db3a7f6.slice/crio-146ee91f8eb64b877a6c210d848697a2124ecd1a32401d34432bd1321201f2ac WatchSource:0}: Error finding container 146ee91f8eb64b877a6c210d848697a2124ecd1a32401d34432bd1321201f2ac: Status 404 returned error can't find the container with id 146ee91f8eb64b877a6c210d848697a2124ecd1a32401d34432bd1321201f2ac Oct 13 13:18:29 crc kubenswrapper[4797]: W1013 13:18:29.221651 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a138b4e_a564_465d_b9cc_592fd40b80bc.slice/crio-4a742c2f4201d35509e4c6be6f4cf609b96c00d3e580232d437ddfba582b93a2 WatchSource:0}: Error finding container 4a742c2f4201d35509e4c6be6f4cf609b96c00d3e580232d437ddfba582b93a2: Status 404 returned error can't find the container with id 4a742c2f4201d35509e4c6be6f4cf609b96c00d3e580232d437ddfba582b93a2 Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.294301 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.391765 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7675bc8ff6-52qlb" event={"ID":"2a138b4e-a564-465d-b9cc-592fd40b80bc","Type":"ContainerStarted","Data":"992af0c632c703cd11fcd88c0ebc60ac4d7d6b52bfe6ce46cd1c25bac8512c55"} Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.392134 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7675bc8ff6-52qlb" event={"ID":"2a138b4e-a564-465d-b9cc-592fd40b80bc","Type":"ContainerStarted","Data":"4a742c2f4201d35509e4c6be6f4cf609b96c00d3e580232d437ddfba582b93a2"} Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.393456 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" event={"ID":"b6e43341-62a8-462b-80c3-86ab7db3a7f6","Type":"ContainerStarted","Data":"146ee91f8eb64b877a6c210d848697a2124ecd1a32401d34432bd1321201f2ac"} Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.394527 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4gzdz" event={"ID":"567b9831-64e4-48ec-bdae-acc961720179","Type":"ContainerStarted","Data":"0db160b8e8de2ed701c72b938af3d2640ff545fb66b03753104755fff4ee6f02"} Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.397017 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" event={"ID":"4b7f9fc5-d10a-4c60-a58c-f3636f159812","Type":"ContainerStarted","Data":"b6224c4587e8e4cb4f311741610de8823bd20b2e8d5f36a211c19471da002ca0"} Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.409689 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7675bc8ff6-52qlb" podStartSLOduration=1.409667984 podStartE2EDuration="1.409667984s" podCreationTimestamp="2025-10-13 13:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:18:29.406732142 +0000 UTC m=+686.940282428" watchObservedRunningTime="2025-10-13 13:18:29.409667984 +0000 UTC m=+686.943218240" Oct 13 13:18:29 crc kubenswrapper[4797]: I1013 13:18:29.498980 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr"] Oct 13 13:18:30 crc kubenswrapper[4797]: I1013 13:18:30.403446 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" event={"ID":"2b289246-79ef-45ae-afdd-ab40537151b5","Type":"ContainerStarted","Data":"f1c760c754848fbc4ba8d8fd7bd5f859212f01b0c85131390a36018074d86f29"} Oct 13 13:18:31 crc kubenswrapper[4797]: I1013 13:18:31.411144 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" event={"ID":"2b289246-79ef-45ae-afdd-ab40537151b5","Type":"ContainerStarted","Data":"39b93424814c01bb3ba0558231ee2b6348a5c373a58e686dfab45ab9e3ba7e05"} Oct 13 13:18:31 crc kubenswrapper[4797]: I1013 13:18:31.411906 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:18:31 crc kubenswrapper[4797]: I1013 13:18:31.412605 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4gzdz" 
event={"ID":"567b9831-64e4-48ec-bdae-acc961720179","Type":"ContainerStarted","Data":"e4154e203e6aa831c71ddc0caf5dc59253d28fb20489b2f56f9d48704fa3b7c6"} Oct 13 13:18:31 crc kubenswrapper[4797]: I1013 13:18:31.413083 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:31 crc kubenswrapper[4797]: I1013 13:18:31.416019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" event={"ID":"4b7f9fc5-d10a-4c60-a58c-f3636f159812","Type":"ContainerStarted","Data":"a7daf375a6ae6842fe779c076871c4b00e5c515eacf28cbc8398ae391c1b905c"} Oct 13 13:18:31 crc kubenswrapper[4797]: I1013 13:18:31.431402 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" podStartSLOduration=1.877016996 podStartE2EDuration="3.431382037s" podCreationTimestamp="2025-10-13 13:18:28 +0000 UTC" firstStartedPulling="2025-10-13 13:18:29.500583348 +0000 UTC m=+687.034133604" lastFinishedPulling="2025-10-13 13:18:31.054948389 +0000 UTC m=+688.588498645" observedRunningTime="2025-10-13 13:18:31.430321171 +0000 UTC m=+688.963871467" watchObservedRunningTime="2025-10-13 13:18:31.431382037 +0000 UTC m=+688.964932333" Oct 13 13:18:32 crc kubenswrapper[4797]: I1013 13:18:32.421774 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" event={"ID":"b6e43341-62a8-462b-80c3-86ab7db3a7f6","Type":"ContainerStarted","Data":"41fd5c30104e04d1699e5532b8dd7c1194deb1fa7cb096d73131e884128f8dbb"} Oct 13 13:18:32 crc kubenswrapper[4797]: I1013 13:18:32.439353 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vcz62" podStartSLOduration=1.7527793470000002 podStartE2EDuration="4.439319921s" podCreationTimestamp="2025-10-13 13:18:28 +0000 UTC" firstStartedPulling="2025-10-13 
13:18:29.218098968 +0000 UTC m=+686.751649224" lastFinishedPulling="2025-10-13 13:18:31.904639532 +0000 UTC m=+689.438189798" observedRunningTime="2025-10-13 13:18:32.436747438 +0000 UTC m=+689.970297714" watchObservedRunningTime="2025-10-13 13:18:32.439319921 +0000 UTC m=+689.972870177" Oct 13 13:18:32 crc kubenswrapper[4797]: I1013 13:18:32.441840 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-4gzdz" podStartSLOduration=2.176633993 podStartE2EDuration="4.441827582s" podCreationTimestamp="2025-10-13 13:18:28 +0000 UTC" firstStartedPulling="2025-10-13 13:18:28.76398428 +0000 UTC m=+686.297534536" lastFinishedPulling="2025-10-13 13:18:31.029177849 +0000 UTC m=+688.562728125" observedRunningTime="2025-10-13 13:18:31.450232388 +0000 UTC m=+688.983782664" watchObservedRunningTime="2025-10-13 13:18:32.441827582 +0000 UTC m=+689.975377848" Oct 13 13:18:33 crc kubenswrapper[4797]: I1013 13:18:33.430784 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" event={"ID":"4b7f9fc5-d10a-4c60-a58c-f3636f159812","Type":"ContainerStarted","Data":"45d45f581016f0ecbaa5925576bcbcdec27b6919bfd738c0c0bcba1d8dbf10ce"} Oct 13 13:18:33 crc kubenswrapper[4797]: I1013 13:18:33.463884 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dgck6" podStartSLOduration=1.386863447 podStartE2EDuration="5.463849022s" podCreationTimestamp="2025-10-13 13:18:28 +0000 UTC" firstStartedPulling="2025-10-13 13:18:28.913510188 +0000 UTC m=+686.447060454" lastFinishedPulling="2025-10-13 13:18:32.990495773 +0000 UTC m=+690.524046029" observedRunningTime="2025-10-13 13:18:33.457110547 +0000 UTC m=+690.990660893" watchObservedRunningTime="2025-10-13 13:18:33.463849022 +0000 UTC m=+690.997399308" Oct 13 13:18:38 crc kubenswrapper[4797]: I1013 13:18:38.768301 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-4gzdz" Oct 13 13:18:39 crc kubenswrapper[4797]: I1013 13:18:39.024142 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:39 crc kubenswrapper[4797]: I1013 13:18:39.024233 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:39 crc kubenswrapper[4797]: I1013 13:18:39.031545 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:39 crc kubenswrapper[4797]: I1013 13:18:39.477572 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7675bc8ff6-52qlb" Oct 13 13:18:39 crc kubenswrapper[4797]: I1013 13:18:39.565751 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8xklm"] Oct 13 13:18:48 crc kubenswrapper[4797]: I1013 13:18:48.120515 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:18:48 crc kubenswrapper[4797]: I1013 13:18:48.121238 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:18:49 crc kubenswrapper[4797]: I1013 13:18:49.300664 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6g2mr" Oct 13 13:19:02 crc kubenswrapper[4797]: I1013 13:19:02.686864 4797 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxrg4"] Oct 13 13:19:02 crc kubenswrapper[4797]: I1013 13:19:02.687546 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" podUID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" containerName="controller-manager" containerID="cri-o://c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" gracePeriod=30 Oct 13 13:19:02 crc kubenswrapper[4797]: I1013 13:19:02.797020 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"] Oct 13 13:19:02 crc kubenswrapper[4797]: I1013 13:19:02.797212 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" podUID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" containerName="route-controller-manager" containerID="cri-o://3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" gracePeriod=30 Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.044306 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.126710 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.178188 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-proxy-ca-bundles\") pod \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.178269 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7bzt\" (UniqueName: \"kubernetes.io/projected/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-kube-api-access-z7bzt\") pod \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.178344 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-serving-cert\") pod \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.178375 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-config\") pod \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.178390 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-client-ca\") pod \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\" (UID: \"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.179339 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" (UID: "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.179699 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" (UID: "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.184150 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-config" (OuterVolumeSpecName: "config") pod "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" (UID: "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.187701 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" (UID: "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.188189 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-kube-api-access-z7bzt" (OuterVolumeSpecName: "kube-api-access-z7bzt") pod "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" (UID: "9d6de0ee-416d-43d1-bf5a-e176bc41b2c5"). InnerVolumeSpecName "kube-api-access-z7bzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.313555 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-serving-cert\") pod \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.313659 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-client-ca\") pod \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.313700 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-config\") pod \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.313816 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5kks\" (UniqueName: \"kubernetes.io/projected/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-kube-api-access-m5kks\") pod \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\" (UID: \"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50\") " Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.314351 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" (UID: "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.314420 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-config" (OuterVolumeSpecName: "config") pod "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" (UID: "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.314954 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7bzt\" (UniqueName: \"kubernetes.io/projected/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-kube-api-access-z7bzt\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.315020 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.315038 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.315051 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.315062 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.315119 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.315130 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.317079 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-kube-api-access-m5kks" (OuterVolumeSpecName: "kube-api-access-m5kks") pod "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" (UID: "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50"). InnerVolumeSpecName "kube-api-access-m5kks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.317737 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" (UID: "ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.416383 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5kks\" (UniqueName: \"kubernetes.io/projected/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-kube-api-access-m5kks\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.416436 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.636655 4797 generic.go:334] "Generic (PLEG): container finished" podID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" exitCode=0 Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.636766 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" event={"ID":"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50","Type":"ContainerDied","Data":"3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938"} Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.636797 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" event={"ID":"ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50","Type":"ContainerDied","Data":"56ce8624dbdeb13b333f32d25598b678031ef627e8efa4d8a1d84e6e1feee495"} Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.636834 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.636844 4797 scope.go:117] "RemoveContainer" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.639882 4797 generic.go:334] "Generic (PLEG): container finished" podID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" exitCode=0 Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.639935 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" event={"ID":"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5","Type":"ContainerDied","Data":"c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375"} Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.639968 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" event={"ID":"9d6de0ee-416d-43d1-bf5a-e176bc41b2c5","Type":"ContainerDied","Data":"3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b"} Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.640023 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxrg4" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.652631 4797 scope.go:117] "RemoveContainer" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.658241 4797 scope.go:117] "RemoveContainer" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" Oct 13 13:19:03 crc kubenswrapper[4797]: E1013 13:19:03.658576 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938\": container with ID starting with 3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938 not found: ID does not exist" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.658600 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938"} err="failed to get container status \"3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938\": rpc error: code = NotFound desc = could not find container \"3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938\": container with ID starting with 3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938 not found: ID does not exist" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.658619 4797 scope.go:117] "RemoveContainer" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" Oct 13 13:19:03 crc kubenswrapper[4797]: E1013 13:19:03.661706 4797 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_route-controller-manager_route-controller-manager-6576b87f9c-lvjgk_openshift-route-controller-manager_ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50_0 in pod sandbox 56ce8624dbdeb13b333f32d25598b678031ef627e8efa4d8a1d84e6e1feee495 from index: no such id: '3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938'" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" Oct 13 13:19:03 crc kubenswrapper[4797]: E1013 13:19:03.661744 4797 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_route-controller-manager_route-controller-manager-6576b87f9c-lvjgk_openshift-route-controller-manager_ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50_0 in pod sandbox 56ce8624dbdeb13b333f32d25598b678031ef627e8efa4d8a1d84e6e1feee495 from index: no such id: '3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938'" containerID="3e6a9b8e8fafbb2d90b4c64e24b8f6b2df766397dfca25b12eb37e51b7c8c938" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.661773 4797 scope.go:117] "RemoveContainer" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.661930 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxrg4"] Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.679540 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxrg4"] Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.680379 4797 scope.go:117] "RemoveContainer" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" Oct 13 13:19:03 crc kubenswrapper[4797]: E1013 13:19:03.680360 4797 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_controller-manager_controller-manager-879f6c89f-wxrg4_openshift-controller-manager_9d6de0ee-416d-43d1-bf5a-e176bc41b2c5_0 in pod sandbox 3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b from index: no such id: 'c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375'" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" Oct 13 13:19:03 crc kubenswrapper[4797]: E1013 13:19:03.680516 4797 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_controller-manager_controller-manager-879f6c89f-wxrg4_openshift-controller-manager_9d6de0ee-416d-43d1-bf5a-e176bc41b2c5_0 in pod sandbox 3d0fb86568e53a68f2537825c73ab55c4499701f69851e09be40d347ceade88b from index: no such id: 'c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375'" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" Oct 13 13:19:03 crc kubenswrapper[4797]: E1013 13:19:03.680972 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375\": container with ID starting with c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375 not found: ID does not exist" containerID="c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375" Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.681046 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375"} err="failed to get container status \"c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375\": rpc error: code = NotFound desc = could not find container \"c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375\": container with ID starting with c6f329d5d6d23692f1fd941bd6552f80ebc318c2a2ee899eb8c17630e23bd375 not found: ID does not exist" 
Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.685840 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"] Oct 13 13:19:03 crc kubenswrapper[4797]: I1013 13:19:03.691110 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lvjgk"] Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.284820 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp"] Oct 13 13:19:04 crc kubenswrapper[4797]: E1013 13:19:04.285424 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" containerName="route-controller-manager" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.285448 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" containerName="route-controller-manager" Oct 13 13:19:04 crc kubenswrapper[4797]: E1013 13:19:04.285462 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" containerName="controller-manager" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.285470 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" containerName="controller-manager" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.285597 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" containerName="route-controller-manager" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.285615 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" containerName="controller-manager" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.286050 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.288189 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.288946 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.289735 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.289998 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.290220 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.290346 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.299259 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp"] Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.325994 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-config\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.326138 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-client-ca\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.326343 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-serving-cert\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.326441 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mntx5\" (UniqueName: \"kubernetes.io/projected/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-kube-api-access-mntx5\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.427278 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mntx5\" (UniqueName: \"kubernetes.io/projected/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-kube-api-access-mntx5\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.427377 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-config\") pod 
\"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.427415 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-client-ca\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.427482 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-serving-cert\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.428461 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-client-ca\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.428621 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-config\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.435147 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-serving-cert\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.447475 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mntx5\" (UniqueName: \"kubernetes.io/projected/e809d708-00fc-4cb4-bf3d-34eee6de3bf3-kube-api-access-mntx5\") pod \"route-controller-manager-84dd4665d-28sxp\" (UID: \"e809d708-00fc-4cb4-bf3d-34eee6de3bf3\") " pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.454677 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-795b9cb645-n7jmm"] Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.455534 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.459899 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.461306 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.461442 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.461855 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.461912 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.462164 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.470607 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795b9cb645-n7jmm"] Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.474376 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.529389 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d7bd76f-eca9-46f9-9655-52ec55379bbc-serving-cert\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " 
pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.529459 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr564\" (UniqueName: \"kubernetes.io/projected/4d7bd76f-eca9-46f9-9655-52ec55379bbc-kube-api-access-zr564\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.529515 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-proxy-ca-bundles\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.529552 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-config\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.529608 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-client-ca\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.602732 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.624664 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8xklm" podUID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" containerName="console" containerID="cri-o://c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9" gracePeriod=15 Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.632756 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d7bd76f-eca9-46f9-9655-52ec55379bbc-serving-cert\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.632872 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr564\" (UniqueName: \"kubernetes.io/projected/4d7bd76f-eca9-46f9-9655-52ec55379bbc-kube-api-access-zr564\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.632921 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-proxy-ca-bundles\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.632956 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-config\") pod 
\"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.633010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-client-ca\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.635398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-config\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.635565 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-proxy-ca-bundles\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.636892 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d7bd76f-eca9-46f9-9655-52ec55379bbc-client-ca\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.640268 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d7bd76f-eca9-46f9-9655-52ec55379bbc-serving-cert\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.662518 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr564\" (UniqueName: \"kubernetes.io/projected/4d7bd76f-eca9-46f9-9655-52ec55379bbc-kube-api-access-zr564\") pod \"controller-manager-795b9cb645-n7jmm\" (UID: \"4d7bd76f-eca9-46f9-9655-52ec55379bbc\") " pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.825416 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:04 crc kubenswrapper[4797]: I1013 13:19:04.861545 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp"] Oct 13 13:19:04 crc kubenswrapper[4797]: W1013 13:19:04.865898 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode809d708_00fc_4cb4_bf3d_34eee6de3bf3.slice/crio-d342c5fa281877d4a1858aa6fa75ca59c027d29ce11cbe7de5f5722883a0c3a9 WatchSource:0}: Error finding container d342c5fa281877d4a1858aa6fa75ca59c027d29ce11cbe7de5f5722883a0c3a9: Status 404 returned error can't find the container with id d342c5fa281877d4a1858aa6fa75ca59c027d29ce11cbe7de5f5722883a0c3a9 Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.014328 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8xklm_c72e2007-fbd4-4c7a-a0fc-9c949a748441/console/0.log" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.014885 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139079 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-oauth-config\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139146 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-config\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139173 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-serving-cert\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139203 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-service-ca\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139228 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-oauth-serving-cert\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139266 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-xn6xb\" (UniqueName: \"kubernetes.io/projected/c72e2007-fbd4-4c7a-a0fc-9c949a748441-kube-api-access-xn6xb\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.139292 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-trusted-ca-bundle\") pod \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\" (UID: \"c72e2007-fbd4-4c7a-a0fc-9c949a748441\") " Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.140113 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.140641 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-service-ca" (OuterVolumeSpecName: "service-ca") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.140859 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.141147 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-config" (OuterVolumeSpecName: "console-config") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.146011 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.146265 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.146880 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72e2007-fbd4-4c7a-a0fc-9c949a748441-kube-api-access-xn6xb" (OuterVolumeSpecName: "kube-api-access-xn6xb") pod "c72e2007-fbd4-4c7a-a0fc-9c949a748441" (UID: "c72e2007-fbd4-4c7a-a0fc-9c949a748441"). InnerVolumeSpecName "kube-api-access-xn6xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240376 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240405 4797 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240416 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn6xb\" (UniqueName: \"kubernetes.io/projected/c72e2007-fbd4-4c7a-a0fc-9c949a748441-kube-api-access-xn6xb\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240425 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240433 4797 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240443 4797 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.240453 4797 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e2007-fbd4-4c7a-a0fc-9c949a748441-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:05 crc 
kubenswrapper[4797]: I1013 13:19:05.243278 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6de0ee-416d-43d1-bf5a-e176bc41b2c5" path="/var/lib/kubelet/pods/9d6de0ee-416d-43d1-bf5a-e176bc41b2c5/volumes" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.244088 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50" path="/var/lib/kubelet/pods/ea99d2e2-7d7a-4af5-8f2d-dadc6c1acd50/volumes" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.337159 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795b9cb645-n7jmm"] Oct 13 13:19:05 crc kubenswrapper[4797]: W1013 13:19:05.344162 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7bd76f_eca9_46f9_9655_52ec55379bbc.slice/crio-d2eff598f1db80ccb1de7f23e2b907cd76d7a55738621c19daa20b1aba15bbb2 WatchSource:0}: Error finding container d2eff598f1db80ccb1de7f23e2b907cd76d7a55738621c19daa20b1aba15bbb2: Status 404 returned error can't find the container with id d2eff598f1db80ccb1de7f23e2b907cd76d7a55738621c19daa20b1aba15bbb2 Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.440446 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8"] Oct 13 13:19:05 crc kubenswrapper[4797]: E1013 13:19:05.440647 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" containerName="console" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.440666 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" containerName="console" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.440786 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" containerName="console" Oct 13 
13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.441503 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.453474 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.499295 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8"] Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.543883 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.543955 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49f2z\" (UniqueName: \"kubernetes.io/projected/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-kube-api-access-49f2z\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.543990 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.645178 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.645234 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49f2z\" (UniqueName: \"kubernetes.io/projected/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-kube-api-access-49f2z\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.645261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.645706 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.645736 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.659855 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" event={"ID":"4d7bd76f-eca9-46f9-9655-52ec55379bbc","Type":"ContainerStarted","Data":"c39612e6296f4d403f0696e4c5003c4f2c8baf34366bb4caf9a228ea1a20aaef"} Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.659904 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.659917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" event={"ID":"4d7bd76f-eca9-46f9-9655-52ec55379bbc","Type":"ContainerStarted","Data":"d2eff598f1db80ccb1de7f23e2b907cd76d7a55738621c19daa20b1aba15bbb2"} Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.660899 4797 patch_prober.go:28] interesting pod/controller-manager-795b9cb645-n7jmm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.660945 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" podUID="4d7bd76f-eca9-46f9-9655-52ec55379bbc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: 
connect: connection refused" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.661944 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" event={"ID":"e809d708-00fc-4cb4-bf3d-34eee6de3bf3","Type":"ContainerStarted","Data":"990ca16c3643934fdc8e8ef1278971792d0434d465a9ea4d0e6d5f0ba6e07800"} Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.661979 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" event={"ID":"e809d708-00fc-4cb4-bf3d-34eee6de3bf3","Type":"ContainerStarted","Data":"d342c5fa281877d4a1858aa6fa75ca59c027d29ce11cbe7de5f5722883a0c3a9"} Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.662109 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664166 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8xklm_c72e2007-fbd4-4c7a-a0fc-9c949a748441/console/0.log" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664204 4797 generic.go:334] "Generic (PLEG): container finished" podID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" containerID="c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9" exitCode=2 Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664230 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8xklm" event={"ID":"c72e2007-fbd4-4c7a-a0fc-9c949a748441","Type":"ContainerDied","Data":"c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9"} Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8xklm" 
event={"ID":"c72e2007-fbd4-4c7a-a0fc-9c949a748441","Type":"ContainerDied","Data":"72744f9bd95604904e2a2099c4bab3a740bad454a60ca03375de631a423cd521"} Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664261 4797 scope.go:117] "RemoveContainer" containerID="c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664338 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49f2z\" (UniqueName: \"kubernetes.io/projected/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-kube-api-access-49f2z\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.664347 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8xklm" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.665917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.681472 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" podStartSLOduration=3.681458451 podStartE2EDuration="3.681458451s" podCreationTimestamp="2025-10-13 13:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:19:05.681127953 +0000 UTC m=+723.214678209" watchObservedRunningTime="2025-10-13 13:19:05.681458451 +0000 UTC m=+723.215008707" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.684683 4797 scope.go:117] "RemoveContainer" containerID="c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9" Oct 13 
13:19:05 crc kubenswrapper[4797]: E1013 13:19:05.685126 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9\": container with ID starting with c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9 not found: ID does not exist" containerID="c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.685157 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9"} err="failed to get container status \"c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9\": rpc error: code = NotFound desc = could not find container \"c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9\": container with ID starting with c4ce615f43ca2b7599743ffc83f05495a08d009733c9986e42c1d90955c430e9 not found: ID does not exist" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.694268 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8xklm"] Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.699078 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8xklm"] Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.711466 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84dd4665d-28sxp" podStartSLOduration=1.711449236 podStartE2EDuration="1.711449236s" podCreationTimestamp="2025-10-13 13:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:19:05.709099088 +0000 UTC m=+723.242649354" watchObservedRunningTime="2025-10-13 13:19:05.711449236 +0000 UTC m=+723.244999492" Oct 13 
13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.756169 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:05 crc kubenswrapper[4797]: I1013 13:19:05.977697 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8"] Oct 13 13:19:05 crc kubenswrapper[4797]: W1013 13:19:05.997055 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76e0cd0f_5fd1_4d5c_9590_c838a57bd5c4.slice/crio-16bb81e9849128992e1a05ae7ef067d82ee2acfc8a74e9de5af641fee9b33620 WatchSource:0}: Error finding container 16bb81e9849128992e1a05ae7ef067d82ee2acfc8a74e9de5af641fee9b33620: Status 404 returned error can't find the container with id 16bb81e9849128992e1a05ae7ef067d82ee2acfc8a74e9de5af641fee9b33620 Oct 13 13:19:06 crc kubenswrapper[4797]: I1013 13:19:06.674718 4797 generic.go:334] "Generic (PLEG): container finished" podID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerID="f9b197697e110617c5b198e297f27652cca2b6f62f96482bc74dabb16c24d561" exitCode=0 Oct 13 13:19:06 crc kubenswrapper[4797]: I1013 13:19:06.674785 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" event={"ID":"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4","Type":"ContainerDied","Data":"f9b197697e110617c5b198e297f27652cca2b6f62f96482bc74dabb16c24d561"} Oct 13 13:19:06 crc kubenswrapper[4797]: I1013 13:19:06.675217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" event={"ID":"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4","Type":"ContainerStarted","Data":"16bb81e9849128992e1a05ae7ef067d82ee2acfc8a74e9de5af641fee9b33620"} Oct 13 13:19:06 crc kubenswrapper[4797]: I1013 
13:19:06.679912 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-795b9cb645-n7jmm" Oct 13 13:19:07 crc kubenswrapper[4797]: I1013 13:19:07.244213 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72e2007-fbd4-4c7a-a0fc-9c949a748441" path="/var/lib/kubelet/pods/c72e2007-fbd4-4c7a-a0fc-9c949a748441/volumes" Oct 13 13:19:08 crc kubenswrapper[4797]: I1013 13:19:08.691560 4797 generic.go:334] "Generic (PLEG): container finished" podID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerID="2c24185bd82ccf050907a0843d3f779c529d68b7a4e3e21a309b47bb23747dbe" exitCode=0 Oct 13 13:19:08 crc kubenswrapper[4797]: I1013 13:19:08.691929 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" event={"ID":"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4","Type":"ContainerDied","Data":"2c24185bd82ccf050907a0843d3f779c529d68b7a4e3e21a309b47bb23747dbe"} Oct 13 13:19:09 crc kubenswrapper[4797]: I1013 13:19:09.699120 4797 generic.go:334] "Generic (PLEG): container finished" podID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerID="69b982f2d9d08bda1e603c5e24aa4e5803f6de0fdc7e3927e1c89d0028d236ee" exitCode=0 Oct 13 13:19:09 crc kubenswrapper[4797]: I1013 13:19:09.699195 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" event={"ID":"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4","Type":"ContainerDied","Data":"69b982f2d9d08bda1e603c5e24aa4e5803f6de0fdc7e3927e1c89d0028d236ee"} Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.004520 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.124117 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-util\") pod \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.124187 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49f2z\" (UniqueName: \"kubernetes.io/projected/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-kube-api-access-49f2z\") pod \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.124288 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-bundle\") pod \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\" (UID: \"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4\") " Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.125162 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-bundle" (OuterVolumeSpecName: "bundle") pod "76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" (UID: "76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.130487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-kube-api-access-49f2z" (OuterVolumeSpecName: "kube-api-access-49f2z") pod "76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" (UID: "76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4"). InnerVolumeSpecName "kube-api-access-49f2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.137152 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-util" (OuterVolumeSpecName: "util") pod "76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" (UID: "76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.227108 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.227242 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-util\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.227268 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49f2z\" (UniqueName: \"kubernetes.io/projected/76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4-kube-api-access-49f2z\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.303169 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-26845"] Oct 13 13:19:11 crc kubenswrapper[4797]: E1013 13:19:11.303423 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="pull" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.303439 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="pull" Oct 13 13:19:11 crc kubenswrapper[4797]: E1013 13:19:11.303452 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="util" Oct 13 
13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.303460 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="util" Oct 13 13:19:11 crc kubenswrapper[4797]: E1013 13:19:11.303485 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="extract" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.303492 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="extract" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.303608 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4" containerName="extract" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.304468 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.317464 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-26845"] Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.429948 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-utilities\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.430023 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcg8w\" (UniqueName: \"kubernetes.io/projected/89444635-3d4e-4855-89da-b911708b4a0d-kube-api-access-zcg8w\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.430117 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-catalog-content\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.531858 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-catalog-content\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.531985 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-utilities\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.532018 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcg8w\" (UniqueName: \"kubernetes.io/projected/89444635-3d4e-4855-89da-b911708b4a0d-kube-api-access-zcg8w\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.532342 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-catalog-content\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.532483 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-utilities\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.560187 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcg8w\" (UniqueName: \"kubernetes.io/projected/89444635-3d4e-4855-89da-b911708b4a0d-kube-api-access-zcg8w\") pod \"redhat-operators-26845\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.627478 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.732842 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" event={"ID":"76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4","Type":"ContainerDied","Data":"16bb81e9849128992e1a05ae7ef067d82ee2acfc8a74e9de5af641fee9b33620"} Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.733225 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16bb81e9849128992e1a05ae7ef067d82ee2acfc8a74e9de5af641fee9b33620" Oct 13 13:19:11 crc kubenswrapper[4797]: I1013 13:19:11.732891 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8" Oct 13 13:19:12 crc kubenswrapper[4797]: I1013 13:19:12.048322 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-26845"] Oct 13 13:19:12 crc kubenswrapper[4797]: W1013 13:19:12.058562 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89444635_3d4e_4855_89da_b911708b4a0d.slice/crio-e0b1a232b2096fc468bd3b2e5427a0ff57c63eb7ac17810cac0afec503b58250 WatchSource:0}: Error finding container e0b1a232b2096fc468bd3b2e5427a0ff57c63eb7ac17810cac0afec503b58250: Status 404 returned error can't find the container with id e0b1a232b2096fc468bd3b2e5427a0ff57c63eb7ac17810cac0afec503b58250 Oct 13 13:19:12 crc kubenswrapper[4797]: I1013 13:19:12.739090 4797 generic.go:334] "Generic (PLEG): container finished" podID="89444635-3d4e-4855-89da-b911708b4a0d" containerID="ef77d2be2fc9ff561466aac1ea3eb4650eda4c0257d70ebf92877cd1912dfa21" exitCode=0 Oct 13 13:19:12 crc kubenswrapper[4797]: I1013 13:19:12.739184 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26845" event={"ID":"89444635-3d4e-4855-89da-b911708b4a0d","Type":"ContainerDied","Data":"ef77d2be2fc9ff561466aac1ea3eb4650eda4c0257d70ebf92877cd1912dfa21"} Oct 13 13:19:12 crc kubenswrapper[4797]: I1013 13:19:12.739439 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26845" event={"ID":"89444635-3d4e-4855-89da-b911708b4a0d","Type":"ContainerStarted","Data":"e0b1a232b2096fc468bd3b2e5427a0ff57c63eb7ac17810cac0afec503b58250"} Oct 13 13:19:14 crc kubenswrapper[4797]: I1013 13:19:14.289591 4797 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 13 13:19:14 crc kubenswrapper[4797]: I1013 13:19:14.754165 4797 generic.go:334] 
"Generic (PLEG): container finished" podID="89444635-3d4e-4855-89da-b911708b4a0d" containerID="2cebf59498e75f1078e66ef6c8800b7952729852dc9c56476648aa10b475bd89" exitCode=0 Oct 13 13:19:14 crc kubenswrapper[4797]: I1013 13:19:14.754214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26845" event={"ID":"89444635-3d4e-4855-89da-b911708b4a0d","Type":"ContainerDied","Data":"2cebf59498e75f1078e66ef6c8800b7952729852dc9c56476648aa10b475bd89"} Oct 13 13:19:15 crc kubenswrapper[4797]: I1013 13:19:15.761426 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26845" event={"ID":"89444635-3d4e-4855-89da-b911708b4a0d","Type":"ContainerStarted","Data":"f0d7df87ba45f8b2a0f5b3ad11e3690952a01e3e3b2048f28b448bf158c5480f"} Oct 13 13:19:15 crc kubenswrapper[4797]: I1013 13:19:15.783282 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-26845" podStartSLOduration=2.342155465 podStartE2EDuration="4.783251556s" podCreationTimestamp="2025-10-13 13:19:11 +0000 UTC" firstStartedPulling="2025-10-13 13:19:12.740640747 +0000 UTC m=+730.274191003" lastFinishedPulling="2025-10-13 13:19:15.181736838 +0000 UTC m=+732.715287094" observedRunningTime="2025-10-13 13:19:15.777640889 +0000 UTC m=+733.311191145" watchObservedRunningTime="2025-10-13 13:19:15.783251556 +0000 UTC m=+733.316801852" Oct 13 13:19:18 crc kubenswrapper[4797]: I1013 13:19:18.119536 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:19:18 crc kubenswrapper[4797]: I1013 13:19:18.120604 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.064580 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb"] Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.065499 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.067240 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.067245 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-989xl" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.067398 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.068199 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.069984 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.087443 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb"] Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.230888 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-apiservice-cert\") pod 
\"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.231079 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-webhook-cert\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.231223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jgq8\" (UniqueName: \"kubernetes.io/projected/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-kube-api-access-5jgq8\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.333074 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-webhook-cert\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.333167 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jgq8\" (UniqueName: \"kubernetes.io/projected/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-kube-api-access-5jgq8\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " 
pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.333210 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-apiservice-cert\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.338914 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-webhook-cert\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.342837 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-apiservice-cert\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.355292 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jgq8\" (UniqueName: \"kubernetes.io/projected/66c7fd1a-54de-49e1-9f09-93bad2b9ed1d-kube-api-access-5jgq8\") pod \"metallb-operator-controller-manager-56c6888874-lkjwb\" (UID: \"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d\") " pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.381125 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.520687 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7"] Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.529421 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.532606 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tgfzd" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.532898 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.533279 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.550629 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7"] Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.650385 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6kmh\" (UniqueName: \"kubernetes.io/projected/ffa5c642-390d-402e-82e5-ec453a7814ee-kube-api-access-z6kmh\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.650464 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ffa5c642-390d-402e-82e5-ec453a7814ee-webhook-cert\") pod 
\"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.650485 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ffa5c642-390d-402e-82e5-ec453a7814ee-apiservice-cert\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.730514 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb"] Oct 13 13:19:19 crc kubenswrapper[4797]: W1013 13:19:19.737156 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c7fd1a_54de_49e1_9f09_93bad2b9ed1d.slice/crio-e64161c95cb0764ba1f14f9c1ac22833581629f1c104031e9e09d500450d40ce WatchSource:0}: Error finding container e64161c95cb0764ba1f14f9c1ac22833581629f1c104031e9e09d500450d40ce: Status 404 returned error can't find the container with id e64161c95cb0764ba1f14f9c1ac22833581629f1c104031e9e09d500450d40ce Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.751250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ffa5c642-390d-402e-82e5-ec453a7814ee-webhook-cert\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.751296 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/ffa5c642-390d-402e-82e5-ec453a7814ee-apiservice-cert\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.751354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6kmh\" (UniqueName: \"kubernetes.io/projected/ffa5c642-390d-402e-82e5-ec453a7814ee-kube-api-access-z6kmh\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.757571 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ffa5c642-390d-402e-82e5-ec453a7814ee-apiservice-cert\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.757580 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ffa5c642-390d-402e-82e5-ec453a7814ee-webhook-cert\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.771520 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6kmh\" (UniqueName: \"kubernetes.io/projected/ffa5c642-390d-402e-82e5-ec453a7814ee-kube-api-access-z6kmh\") pod \"metallb-operator-webhook-server-69f86b4dfd-84kr7\" (UID: \"ffa5c642-390d-402e-82e5-ec453a7814ee\") " pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 
13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.782303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" event={"ID":"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d","Type":"ContainerStarted","Data":"e64161c95cb0764ba1f14f9c1ac22833581629f1c104031e9e09d500450d40ce"} Oct 13 13:19:19 crc kubenswrapper[4797]: I1013 13:19:19.902956 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:20 crc kubenswrapper[4797]: I1013 13:19:20.372375 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7"] Oct 13 13:19:20 crc kubenswrapper[4797]: W1013 13:19:20.379650 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa5c642_390d_402e_82e5_ec453a7814ee.slice/crio-76688d9f160907df424af95b5c47d3542e14e31f738266c35aa0deac5065965d WatchSource:0}: Error finding container 76688d9f160907df424af95b5c47d3542e14e31f738266c35aa0deac5065965d: Status 404 returned error can't find the container with id 76688d9f160907df424af95b5c47d3542e14e31f738266c35aa0deac5065965d Oct 13 13:19:20 crc kubenswrapper[4797]: I1013 13:19:20.795987 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" event={"ID":"ffa5c642-390d-402e-82e5-ec453a7814ee","Type":"ContainerStarted","Data":"76688d9f160907df424af95b5c47d3542e14e31f738266c35aa0deac5065965d"} Oct 13 13:19:21 crc kubenswrapper[4797]: I1013 13:19:21.627933 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:21 crc kubenswrapper[4797]: I1013 13:19:21.628012 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-26845" Oct 13 
13:19:21 crc kubenswrapper[4797]: I1013 13:19:21.683127 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:21 crc kubenswrapper[4797]: I1013 13:19:21.869185 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:23 crc kubenswrapper[4797]: I1013 13:19:23.823433 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" event={"ID":"66c7fd1a-54de-49e1-9f09-93bad2b9ed1d","Type":"ContainerStarted","Data":"cd59b684b9d131b547a4ff6ce59d05fd70e9e7eb256b0c9ea1f7882d98fc4a9e"} Oct 13 13:19:23 crc kubenswrapper[4797]: I1013 13:19:23.823827 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:19:23 crc kubenswrapper[4797]: I1013 13:19:23.849887 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" podStartSLOduration=1.4484401359999999 podStartE2EDuration="4.849863051s" podCreationTimestamp="2025-10-13 13:19:19 +0000 UTC" firstStartedPulling="2025-10-13 13:19:19.740116562 +0000 UTC m=+737.273666818" lastFinishedPulling="2025-10-13 13:19:23.141539477 +0000 UTC m=+740.675089733" observedRunningTime="2025-10-13 13:19:23.840392879 +0000 UTC m=+741.373943145" watchObservedRunningTime="2025-10-13 13:19:23.849863051 +0000 UTC m=+741.383413307" Oct 13 13:19:23 crc kubenswrapper[4797]: I1013 13:19:23.895410 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-26845"] Oct 13 13:19:23 crc kubenswrapper[4797]: I1013 13:19:23.895625 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-26845" podUID="89444635-3d4e-4855-89da-b911708b4a0d" 
containerName="registry-server" containerID="cri-o://f0d7df87ba45f8b2a0f5b3ad11e3690952a01e3e3b2048f28b448bf158c5480f" gracePeriod=2 Oct 13 13:19:24 crc kubenswrapper[4797]: I1013 13:19:24.832215 4797 generic.go:334] "Generic (PLEG): container finished" podID="89444635-3d4e-4855-89da-b911708b4a0d" containerID="f0d7df87ba45f8b2a0f5b3ad11e3690952a01e3e3b2048f28b448bf158c5480f" exitCode=0 Oct 13 13:19:24 crc kubenswrapper[4797]: I1013 13:19:24.832274 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26845" event={"ID":"89444635-3d4e-4855-89da-b911708b4a0d","Type":"ContainerDied","Data":"f0d7df87ba45f8b2a0f5b3ad11e3690952a01e3e3b2048f28b448bf158c5480f"} Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.486927 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.550553 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcg8w\" (UniqueName: \"kubernetes.io/projected/89444635-3d4e-4855-89da-b911708b4a0d-kube-api-access-zcg8w\") pod \"89444635-3d4e-4855-89da-b911708b4a0d\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.558081 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89444635-3d4e-4855-89da-b911708b4a0d-kube-api-access-zcg8w" (OuterVolumeSpecName: "kube-api-access-zcg8w") pod "89444635-3d4e-4855-89da-b911708b4a0d" (UID: "89444635-3d4e-4855-89da-b911708b4a0d"). InnerVolumeSpecName "kube-api-access-zcg8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.651496 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-utilities\") pod \"89444635-3d4e-4855-89da-b911708b4a0d\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.651563 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-catalog-content\") pod \"89444635-3d4e-4855-89da-b911708b4a0d\" (UID: \"89444635-3d4e-4855-89da-b911708b4a0d\") " Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.651859 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcg8w\" (UniqueName: \"kubernetes.io/projected/89444635-3d4e-4855-89da-b911708b4a0d-kube-api-access-zcg8w\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.652232 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-utilities" (OuterVolumeSpecName: "utilities") pod "89444635-3d4e-4855-89da-b911708b4a0d" (UID: "89444635-3d4e-4855-89da-b911708b4a0d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.753154 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.768951 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89444635-3d4e-4855-89da-b911708b4a0d" (UID: "89444635-3d4e-4855-89da-b911708b4a0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.839042 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" event={"ID":"ffa5c642-390d-402e-82e5-ec453a7814ee","Type":"ContainerStarted","Data":"b92310f7e1cd0d0c194b6a31651b3d66e087d88135427cb272b0fc9ac9a0887f"} Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.839207 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.841381 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26845" event={"ID":"89444635-3d4e-4855-89da-b911708b4a0d","Type":"ContainerDied","Data":"e0b1a232b2096fc468bd3b2e5427a0ff57c63eb7ac17810cac0afec503b58250"} Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.841435 4797 scope.go:117] "RemoveContainer" containerID="f0d7df87ba45f8b2a0f5b3ad11e3690952a01e3e3b2048f28b448bf158c5480f" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.841451 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-26845" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.854376 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89444635-3d4e-4855-89da-b911708b4a0d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.856630 4797 scope.go:117] "RemoveContainer" containerID="2cebf59498e75f1078e66ef6c8800b7952729852dc9c56476648aa10b475bd89" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.862346 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" podStartSLOduration=2.063982787 podStartE2EDuration="6.862327116s" podCreationTimestamp="2025-10-13 13:19:19 +0000 UTC" firstStartedPulling="2025-10-13 13:19:20.383922157 +0000 UTC m=+737.917472413" lastFinishedPulling="2025-10-13 13:19:25.182266486 +0000 UTC m=+742.715816742" observedRunningTime="2025-10-13 13:19:25.860465601 +0000 UTC m=+743.394015877" watchObservedRunningTime="2025-10-13 13:19:25.862327116 +0000 UTC m=+743.395877372" Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.889487 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-26845"] Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.893056 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-26845"] Oct 13 13:19:25 crc kubenswrapper[4797]: I1013 13:19:25.898490 4797 scope.go:117] "RemoveContainer" containerID="ef77d2be2fc9ff561466aac1ea3eb4650eda4c0257d70ebf92877cd1912dfa21" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.505681 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-czsxc"] Oct 13 13:19:26 crc kubenswrapper[4797]: E1013 13:19:26.506105 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="extract-utilities" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.506139 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="extract-utilities" Oct 13 13:19:26 crc kubenswrapper[4797]: E1013 13:19:26.506176 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="registry-server" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.506190 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="registry-server" Oct 13 13:19:26 crc kubenswrapper[4797]: E1013 13:19:26.506277 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="extract-content" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.506296 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="extract-content" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.506473 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="89444635-3d4e-4855-89da-b911708b4a0d" containerName="registry-server" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.507940 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.512939 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-czsxc"] Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.562296 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92m5\" (UniqueName: \"kubernetes.io/projected/92f3f2e2-e980-438e-845c-d5d4991aaf2f-kube-api-access-l92m5\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.562365 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-utilities\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.562385 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-catalog-content\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.663265 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-utilities\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.663576 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-catalog-content\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.663644 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92m5\" (UniqueName: \"kubernetes.io/projected/92f3f2e2-e980-438e-845c-d5d4991aaf2f-kube-api-access-l92m5\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.663763 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-utilities\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.664010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-catalog-content\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.684672 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92m5\" (UniqueName: \"kubernetes.io/projected/92f3f2e2-e980-438e-845c-d5d4991aaf2f-kube-api-access-l92m5\") pod \"certified-operators-czsxc\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:26 crc kubenswrapper[4797]: I1013 13:19:26.827424 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:27 crc kubenswrapper[4797]: I1013 13:19:27.244166 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89444635-3d4e-4855-89da-b911708b4a0d" path="/var/lib/kubelet/pods/89444635-3d4e-4855-89da-b911708b4a0d/volumes" Oct 13 13:19:27 crc kubenswrapper[4797]: I1013 13:19:27.288882 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-czsxc"] Oct 13 13:19:27 crc kubenswrapper[4797]: I1013 13:19:27.878871 4797 generic.go:334] "Generic (PLEG): container finished" podID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerID="3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386" exitCode=0 Oct 13 13:19:27 crc kubenswrapper[4797]: I1013 13:19:27.879197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czsxc" event={"ID":"92f3f2e2-e980-438e-845c-d5d4991aaf2f","Type":"ContainerDied","Data":"3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386"} Oct 13 13:19:27 crc kubenswrapper[4797]: I1013 13:19:27.879273 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czsxc" event={"ID":"92f3f2e2-e980-438e-845c-d5d4991aaf2f","Type":"ContainerStarted","Data":"62121fe245598d0383ca9140afd33555227b4d5b6d69af877206d57861f38a6b"} Oct 13 13:19:29 crc kubenswrapper[4797]: I1013 13:19:29.895334 4797 generic.go:334] "Generic (PLEG): container finished" podID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerID="04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b" exitCode=0 Oct 13 13:19:29 crc kubenswrapper[4797]: I1013 13:19:29.895530 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czsxc" event={"ID":"92f3f2e2-e980-438e-845c-d5d4991aaf2f","Type":"ContainerDied","Data":"04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b"} Oct 13 13:19:30 crc 
kubenswrapper[4797]: I1013 13:19:30.905183 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czsxc" event={"ID":"92f3f2e2-e980-438e-845c-d5d4991aaf2f","Type":"ContainerStarted","Data":"081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697"} Oct 13 13:19:36 crc kubenswrapper[4797]: I1013 13:19:36.827622 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:36 crc kubenswrapper[4797]: I1013 13:19:36.828246 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:36 crc kubenswrapper[4797]: I1013 13:19:36.872966 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:36 crc kubenswrapper[4797]: I1013 13:19:36.912272 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-czsxc" podStartSLOduration=8.436512928 podStartE2EDuration="10.912247947s" podCreationTimestamp="2025-10-13 13:19:26 +0000 UTC" firstStartedPulling="2025-10-13 13:19:27.891293037 +0000 UTC m=+745.424843333" lastFinishedPulling="2025-10-13 13:19:30.367028086 +0000 UTC m=+747.900578352" observedRunningTime="2025-10-13 13:19:30.921179554 +0000 UTC m=+748.454729810" watchObservedRunningTime="2025-10-13 13:19:36.912247947 +0000 UTC m=+754.445798223" Oct 13 13:19:37 crc kubenswrapper[4797]: I1013 13:19:37.014074 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.294798 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-czsxc"] Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.295034 4797 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-czsxc" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="registry-server" containerID="cri-o://081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697" gracePeriod=2 Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.690105 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.833601 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92m5\" (UniqueName: \"kubernetes.io/projected/92f3f2e2-e980-438e-845c-d5d4991aaf2f-kube-api-access-l92m5\") pod \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.833720 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-utilities\") pod \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.833778 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-catalog-content\") pod \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\" (UID: \"92f3f2e2-e980-438e-845c-d5d4991aaf2f\") " Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.834514 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-utilities" (OuterVolumeSpecName: "utilities") pod "92f3f2e2-e980-438e-845c-d5d4991aaf2f" (UID: "92f3f2e2-e980-438e-845c-d5d4991aaf2f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.850704 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f3f2e2-e980-438e-845c-d5d4991aaf2f-kube-api-access-l92m5" (OuterVolumeSpecName: "kube-api-access-l92m5") pod "92f3f2e2-e980-438e-845c-d5d4991aaf2f" (UID: "92f3f2e2-e980-438e-845c-d5d4991aaf2f"). InnerVolumeSpecName "kube-api-access-l92m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.907398 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69f86b4dfd-84kr7" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.934849 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92m5\" (UniqueName: \"kubernetes.io/projected/92f3f2e2-e980-438e-845c-d5d4991aaf2f-kube-api-access-l92m5\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.934877 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.953506 4797 generic.go:334] "Generic (PLEG): container finished" podID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerID="081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697" exitCode=0 Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.953548 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czsxc" event={"ID":"92f3f2e2-e980-438e-845c-d5d4991aaf2f","Type":"ContainerDied","Data":"081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697"} Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.953558 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czsxc" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.953572 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czsxc" event={"ID":"92f3f2e2-e980-438e-845c-d5d4991aaf2f","Type":"ContainerDied","Data":"62121fe245598d0383ca9140afd33555227b4d5b6d69af877206d57861f38a6b"} Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.953587 4797 scope.go:117] "RemoveContainer" containerID="081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.972514 4797 scope.go:117] "RemoveContainer" containerID="04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b" Oct 13 13:19:39 crc kubenswrapper[4797]: I1013 13:19:39.988578 4797 scope.go:117] "RemoveContainer" containerID="3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.010770 4797 scope.go:117] "RemoveContainer" containerID="081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697" Oct 13 13:19:40 crc kubenswrapper[4797]: E1013 13:19:40.011253 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697\": container with ID starting with 081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697 not found: ID does not exist" containerID="081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.011290 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697"} err="failed to get container status \"081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697\": rpc error: code = NotFound desc = could not find container 
\"081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697\": container with ID starting with 081e0dd3988bc25cf6d469514528cf8fb7249327757d46a7c39236e640d21697 not found: ID does not exist" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.011316 4797 scope.go:117] "RemoveContainer" containerID="04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b" Oct 13 13:19:40 crc kubenswrapper[4797]: E1013 13:19:40.011664 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b\": container with ID starting with 04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b not found: ID does not exist" containerID="04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.011695 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b"} err="failed to get container status \"04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b\": rpc error: code = NotFound desc = could not find container \"04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b\": container with ID starting with 04a83b693052f3611963aa4279fa82109716e24a66ee09e949ccd3712e43792b not found: ID does not exist" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.011723 4797 scope.go:117] "RemoveContainer" containerID="3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386" Oct 13 13:19:40 crc kubenswrapper[4797]: E1013 13:19:40.012024 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386\": container with ID starting with 3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386 not found: ID does not exist" 
containerID="3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.012055 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386"} err="failed to get container status \"3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386\": rpc error: code = NotFound desc = could not find container \"3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386\": container with ID starting with 3310fc2ec2b374280bc7129d2c3feee90c6d404b8b52c30323e0a731848bd386 not found: ID does not exist" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.520899 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92f3f2e2-e980-438e-845c-d5d4991aaf2f" (UID: "92f3f2e2-e980-438e-845c-d5d4991aaf2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.544151 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92f3f2e2-e980-438e-845c-d5d4991aaf2f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.593318 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-czsxc"] Oct 13 13:19:40 crc kubenswrapper[4797]: I1013 13:19:40.597837 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-czsxc"] Oct 13 13:19:41 crc kubenswrapper[4797]: I1013 13:19:41.243586 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" path="/var/lib/kubelet/pods/92f3f2e2-e980-438e-845c-d5d4991aaf2f/volumes" Oct 13 13:19:48 crc kubenswrapper[4797]: I1013 13:19:48.119864 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:19:48 crc kubenswrapper[4797]: I1013 13:19:48.120521 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:19:48 crc kubenswrapper[4797]: I1013 13:19:48.120575 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:19:48 crc kubenswrapper[4797]: I1013 13:19:48.121282 4797 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61854bbd861c1fc9b67c996c47d52d46e92470dc4bfb3423c7c24026ce57b8ba"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:19:48 crc kubenswrapper[4797]: I1013 13:19:48.121363 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://61854bbd861c1fc9b67c996c47d52d46e92470dc4bfb3423c7c24026ce57b8ba" gracePeriod=600 Oct 13 13:19:49 crc kubenswrapper[4797]: I1013 13:19:49.011626 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="61854bbd861c1fc9b67c996c47d52d46e92470dc4bfb3423c7c24026ce57b8ba" exitCode=0 Oct 13 13:19:49 crc kubenswrapper[4797]: I1013 13:19:49.011728 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"61854bbd861c1fc9b67c996c47d52d46e92470dc4bfb3423c7c24026ce57b8ba"} Oct 13 13:19:49 crc kubenswrapper[4797]: I1013 13:19:49.012048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"d13bde04be8878f52602789b8a495d96204227aa290488bc4d6eac0aef285521"} Oct 13 13:19:49 crc kubenswrapper[4797]: I1013 13:19:49.012080 4797 scope.go:117] "RemoveContainer" containerID="05b59feccb2234abc6b4fd15b059be4b96eb36f86698f4039d57c3c1a3c8d369" Oct 13 13:19:59 crc kubenswrapper[4797]: I1013 13:19:59.386031 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-56c6888874-lkjwb" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.147846 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xjs8k"] Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.148062 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="extract-utilities" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.148077 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="extract-utilities" Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.148092 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="extract-content" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.148098 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="extract-content" Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.148110 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="registry-server" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.148117 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="registry-server" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.148220 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f3f2e2-e980-438e-845c-d5d4991aaf2f" containerName="registry-server" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.149933 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.151622 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.151829 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cf878" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.151933 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.166881 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8"] Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.167525 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.173026 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.189798 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8"] Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.190043 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c42d5-f403-43ce-80fd-e38f20646272-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zxbq8\" (UID: \"5a9c42d5-f403-43ce-80fd-e38f20646272\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.190075 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9fkc\" (UniqueName: 
\"kubernetes.io/projected/5a9c42d5-f403-43ce-80fd-e38f20646272-kube-api-access-q9fkc\") pod \"frr-k8s-webhook-server-64bf5d555-zxbq8\" (UID: \"5a9c42d5-f403-43ce-80fd-e38f20646272\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.282090 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kws7b"] Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.282915 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.284956 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.285013 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-d4njq" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.286267 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.286297 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.286579 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-v7qt6"] Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.287435 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.290940 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-startup\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.290999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c42d5-f403-43ce-80fd-e38f20646272-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zxbq8\" (UID: \"5a9c42d5-f403-43ce-80fd-e38f20646272\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291184 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9fkc\" (UniqueName: \"kubernetes.io/projected/5a9c42d5-f403-43ce-80fd-e38f20646272-kube-api-access-q9fkc\") pod \"frr-k8s-webhook-server-64bf5d555-zxbq8\" (UID: \"5a9c42d5-f403-43ce-80fd-e38f20646272\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291424 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44e2efb9-07fa-42db-a605-44970bbe88bc-metrics-certs\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291488 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dth4\" (UniqueName: \"kubernetes.io/projected/44e2efb9-07fa-42db-a605-44970bbe88bc-kube-api-access-6dth4\") pod \"frr-k8s-xjs8k\" (UID: 
\"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291524 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-sockets\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291630 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-conf\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291658 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-metrics\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.291696 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-reloader\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.292261 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.299111 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-v7qt6"] Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.317774 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9fkc\" (UniqueName: \"kubernetes.io/projected/5a9c42d5-f403-43ce-80fd-e38f20646272-kube-api-access-q9fkc\") pod \"frr-k8s-webhook-server-64bf5d555-zxbq8\" (UID: \"5a9c42d5-f403-43ce-80fd-e38f20646272\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.319079 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c42d5-f403-43ce-80fd-e38f20646272-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zxbq8\" (UID: \"5a9c42d5-f403-43ce-80fd-e38f20646272\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392547 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76b8m\" (UniqueName: \"kubernetes.io/projected/3fad6639-3207-47d1-9c1c-f28bed56e219-kube-api-access-76b8m\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392594 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44e2efb9-07fa-42db-a605-44970bbe88bc-metrics-certs\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dth4\" (UniqueName: \"kubernetes.io/projected/44e2efb9-07fa-42db-a605-44970bbe88bc-kube-api-access-6dth4\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392635 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-sockets\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392662 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fad6639-3207-47d1-9c1c-f28bed56e219-metrics-certs\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392687 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-conf\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392704 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-metrics\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.392772 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c2336e04-93f4-4e2a-a221-21c09083f0ac-metallb-excludel2\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-reloader\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393221 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-metrics-certs\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393270 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-metrics\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393308 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-conf\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393370 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fad6639-3207-47d1-9c1c-f28bed56e219-cert\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " 
pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393403 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hwvq\" (UniqueName: \"kubernetes.io/projected/c2336e04-93f4-4e2a-a221-21c09083f0ac-kube-api-access-7hwvq\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393495 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-startup\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.393528 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-reloader\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.394402 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-startup\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.394623 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/44e2efb9-07fa-42db-a605-44970bbe88bc-frr-sockets\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.395999 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44e2efb9-07fa-42db-a605-44970bbe88bc-metrics-certs\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.411033 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dth4\" (UniqueName: \"kubernetes.io/projected/44e2efb9-07fa-42db-a605-44970bbe88bc-kube-api-access-6dth4\") pod \"frr-k8s-xjs8k\" (UID: \"44e2efb9-07fa-42db-a605-44970bbe88bc\") " pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.463061 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.480107 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.494425 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76b8m\" (UniqueName: \"kubernetes.io/projected/3fad6639-3207-47d1-9c1c-f28bed56e219-kube-api-access-76b8m\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.494474 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fad6639-3207-47d1-9c1c-f28bed56e219-metrics-certs\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.494525 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/c2336e04-93f4-4e2a-a221-21c09083f0ac-metallb-excludel2\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.494549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-metrics-certs\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.494910 4797 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.495023 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-metrics-certs podName:c2336e04-93f4-4e2a-a221-21c09083f0ac nodeName:}" failed. No retries permitted until 2025-10-13 13:20:00.994994019 +0000 UTC m=+778.528544295 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-metrics-certs") pod "speaker-kws7b" (UID: "c2336e04-93f4-4e2a-a221-21c09083f0ac") : secret "speaker-certs-secret" not found Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.495188 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c2336e04-93f4-4e2a-a221-21c09083f0ac-metallb-excludel2\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.495244 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.495266 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fad6639-3207-47d1-9c1c-f28bed56e219-cert\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.495282 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hwvq\" (UniqueName: \"kubernetes.io/projected/c2336e04-93f4-4e2a-a221-21c09083f0ac-kube-api-access-7hwvq\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.495412 4797 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 13 13:20:00 crc kubenswrapper[4797]: E1013 13:20:00.495462 4797 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist podName:c2336e04-93f4-4e2a-a221-21c09083f0ac nodeName:}" failed. No retries permitted until 2025-10-13 13:20:00.99544643 +0000 UTC m=+778.528996796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist") pod "speaker-kws7b" (UID: "c2336e04-93f4-4e2a-a221-21c09083f0ac") : secret "metallb-memberlist" not found Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.498329 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fad6639-3207-47d1-9c1c-f28bed56e219-metrics-certs\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.502239 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fad6639-3207-47d1-9c1c-f28bed56e219-cert\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.515842 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hwvq\" (UniqueName: \"kubernetes.io/projected/c2336e04-93f4-4e2a-a221-21c09083f0ac-kube-api-access-7hwvq\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.519496 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76b8m\" (UniqueName: \"kubernetes.io/projected/3fad6639-3207-47d1-9c1c-f28bed56e219-kube-api-access-76b8m\") pod \"controller-68d546b9d8-v7qt6\" (UID: \"3fad6639-3207-47d1-9c1c-f28bed56e219\") " 
pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.646847 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.846171 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-v7qt6"] Oct 13 13:20:00 crc kubenswrapper[4797]: W1013 13:20:00.851995 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fad6639_3207_47d1_9c1c_f28bed56e219.slice/crio-bf3e9518ca3edb5f17d9e972043e46f87905c542b827e3d7cd44e97c25097706 WatchSource:0}: Error finding container bf3e9518ca3edb5f17d9e972043e46f87905c542b827e3d7cd44e97c25097706: Status 404 returned error can't find the container with id bf3e9518ca3edb5f17d9e972043e46f87905c542b827e3d7cd44e97c25097706 Oct 13 13:20:00 crc kubenswrapper[4797]: I1013 13:20:00.922048 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8"] Oct 13 13:20:00 crc kubenswrapper[4797]: W1013 13:20:00.947770 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a9c42d5_f403_43ce_80fd_e38f20646272.slice/crio-c282850ffe3550ee5fe16be9af8576be5f5faf6442f0c4e6938229c5738576a8 WatchSource:0}: Error finding container c282850ffe3550ee5fe16be9af8576be5f5faf6442f0c4e6938229c5738576a8: Status 404 returned error can't find the container with id c282850ffe3550ee5fe16be9af8576be5f5faf6442f0c4e6938229c5738576a8 Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.000364 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-metrics-certs\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " 
pod="metallb-system/speaker-kws7b" Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.000400 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:01 crc kubenswrapper[4797]: E1013 13:20:01.000519 4797 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 13 13:20:01 crc kubenswrapper[4797]: E1013 13:20:01.000572 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist podName:c2336e04-93f4-4e2a-a221-21c09083f0ac nodeName:}" failed. No retries permitted until 2025-10-13 13:20:02.000557548 +0000 UTC m=+779.534107814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist") pod "speaker-kws7b" (UID: "c2336e04-93f4-4e2a-a221-21c09083f0ac") : secret "metallb-memberlist" not found Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.006953 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-metrics-certs\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.086017 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-v7qt6" event={"ID":"3fad6639-3207-47d1-9c1c-f28bed56e219","Type":"ContainerStarted","Data":"84e0a2890161dc0ba6dd6a319ef152db3820c9874fc93a0ce515ccd065786fda"} Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.086058 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-68d546b9d8-v7qt6" event={"ID":"3fad6639-3207-47d1-9c1c-f28bed56e219","Type":"ContainerStarted","Data":"bf3e9518ca3edb5f17d9e972043e46f87905c542b827e3d7cd44e97c25097706"} Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.086990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" event={"ID":"5a9c42d5-f403-43ce-80fd-e38f20646272","Type":"ContainerStarted","Data":"c282850ffe3550ee5fe16be9af8576be5f5faf6442f0c4e6938229c5738576a8"} Oct 13 13:20:01 crc kubenswrapper[4797]: I1013 13:20:01.087683 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"acd11b92e3929dd7fe5ad505e84da83864073937910a47c0ca923f48ca9f6042"} Oct 13 13:20:02 crc kubenswrapper[4797]: I1013 13:20:02.011303 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:02 crc kubenswrapper[4797]: I1013 13:20:02.016955 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c2336e04-93f4-4e2a-a221-21c09083f0ac-memberlist\") pod \"speaker-kws7b\" (UID: \"c2336e04-93f4-4e2a-a221-21c09083f0ac\") " pod="metallb-system/speaker-kws7b" Oct 13 13:20:02 crc kubenswrapper[4797]: I1013 13:20:02.095477 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-v7qt6" event={"ID":"3fad6639-3207-47d1-9c1c-f28bed56e219","Type":"ContainerStarted","Data":"f6b2cc0efe97edecce4df870e0c6fabcfe31cb78918684acf0f18cf07362b452"} Oct 13 13:20:02 crc kubenswrapper[4797]: I1013 13:20:02.095628 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:02 crc kubenswrapper[4797]: I1013 13:20:02.098351 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kws7b" Oct 13 13:20:02 crc kubenswrapper[4797]: I1013 13:20:02.115455 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-v7qt6" podStartSLOduration=2.115433895 podStartE2EDuration="2.115433895s" podCreationTimestamp="2025-10-13 13:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:20:02.111369296 +0000 UTC m=+779.644919592" watchObservedRunningTime="2025-10-13 13:20:02.115433895 +0000 UTC m=+779.648984161" Oct 13 13:20:02 crc kubenswrapper[4797]: W1013 13:20:02.124672 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2336e04_93f4_4e2a_a221_21c09083f0ac.slice/crio-8e9065e488e7c63232ec753a024e68737b211008de56ae6ef9b14a53f2b94715 WatchSource:0}: Error finding container 8e9065e488e7c63232ec753a024e68737b211008de56ae6ef9b14a53f2b94715: Status 404 returned error can't find the container with id 8e9065e488e7c63232ec753a024e68737b211008de56ae6ef9b14a53f2b94715 Oct 13 13:20:03 crc kubenswrapper[4797]: I1013 13:20:03.106710 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kws7b" event={"ID":"c2336e04-93f4-4e2a-a221-21c09083f0ac","Type":"ContainerStarted","Data":"9f273aee5ddaabe1dbf2f65fc70fe82ce546e15e60246b00ee0ec200aafd0f3e"} Oct 13 13:20:03 crc kubenswrapper[4797]: I1013 13:20:03.107069 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kws7b" event={"ID":"c2336e04-93f4-4e2a-a221-21c09083f0ac","Type":"ContainerStarted","Data":"376dee2ca5ca68beb3403dd8ac14c681601103cae1fd3d8a25dde622c8a44220"} Oct 13 13:20:03 crc kubenswrapper[4797]: I1013 
13:20:03.107084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kws7b" event={"ID":"c2336e04-93f4-4e2a-a221-21c09083f0ac","Type":"ContainerStarted","Data":"8e9065e488e7c63232ec753a024e68737b211008de56ae6ef9b14a53f2b94715"} Oct 13 13:20:03 crc kubenswrapper[4797]: I1013 13:20:03.107259 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kws7b" Oct 13 13:20:03 crc kubenswrapper[4797]: I1013 13:20:03.130469 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kws7b" podStartSLOduration=3.130449208 podStartE2EDuration="3.130449208s" podCreationTimestamp="2025-10-13 13:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:20:03.127480506 +0000 UTC m=+780.661030782" watchObservedRunningTime="2025-10-13 13:20:03.130449208 +0000 UTC m=+780.663999464" Oct 13 13:20:09 crc kubenswrapper[4797]: I1013 13:20:09.144170 4797 generic.go:334] "Generic (PLEG): container finished" podID="44e2efb9-07fa-42db-a605-44970bbe88bc" containerID="182c16b9c30483f57faf191e9fbf1246c06fa55f0bd3aabc1f85a86f29e7092d" exitCode=0 Oct 13 13:20:09 crc kubenswrapper[4797]: I1013 13:20:09.144213 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerDied","Data":"182c16b9c30483f57faf191e9fbf1246c06fa55f0bd3aabc1f85a86f29e7092d"} Oct 13 13:20:09 crc kubenswrapper[4797]: I1013 13:20:09.146367 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" event={"ID":"5a9c42d5-f403-43ce-80fd-e38f20646272","Type":"ContainerStarted","Data":"02ffead86ed1df753748ce3a90d9eba9af0f45a52782acb5cac3e8b170466d9e"} Oct 13 13:20:09 crc kubenswrapper[4797]: I1013 13:20:09.146553 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:09 crc kubenswrapper[4797]: I1013 13:20:09.182199 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" podStartSLOduration=1.838022852 podStartE2EDuration="9.182180276s" podCreationTimestamp="2025-10-13 13:20:00 +0000 UTC" firstStartedPulling="2025-10-13 13:20:00.950233385 +0000 UTC m=+778.483783651" lastFinishedPulling="2025-10-13 13:20:08.294390819 +0000 UTC m=+785.827941075" observedRunningTime="2025-10-13 13:20:09.181550451 +0000 UTC m=+786.715100717" watchObservedRunningTime="2025-10-13 13:20:09.182180276 +0000 UTC m=+786.715730532" Oct 13 13:20:10 crc kubenswrapper[4797]: I1013 13:20:10.152850 4797 generic.go:334] "Generic (PLEG): container finished" podID="44e2efb9-07fa-42db-a605-44970bbe88bc" containerID="23c5e733a7e3cc630f283f965ef63a546be6a00fb69a993695853ed99c41c20a" exitCode=0 Oct 13 13:20:10 crc kubenswrapper[4797]: I1013 13:20:10.153688 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerDied","Data":"23c5e733a7e3cc630f283f965ef63a546be6a00fb69a993695853ed99c41c20a"} Oct 13 13:20:10 crc kubenswrapper[4797]: I1013 13:20:10.650903 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-v7qt6" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.102123 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-phmt9"] Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.103144 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.129976 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phmt9"] Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.154928 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk5k4\" (UniqueName: \"kubernetes.io/projected/1fd8962b-9e85-452e-b0db-8d1f12109329-kube-api-access-tk5k4\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.154968 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd8962b-9e85-452e-b0db-8d1f12109329-utilities\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.154999 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd8962b-9e85-452e-b0db-8d1f12109329-catalog-content\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.159303 4797 generic.go:334] "Generic (PLEG): container finished" podID="44e2efb9-07fa-42db-a605-44970bbe88bc" containerID="19c9904a434fe3e239a374b91fd1a19a590adaa64ffe2b693f12e49478d26147" exitCode=0 Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.159345 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" 
event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerDied","Data":"19c9904a434fe3e239a374b91fd1a19a590adaa64ffe2b693f12e49478d26147"} Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.257549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk5k4\" (UniqueName: \"kubernetes.io/projected/1fd8962b-9e85-452e-b0db-8d1f12109329-kube-api-access-tk5k4\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.257618 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd8962b-9e85-452e-b0db-8d1f12109329-utilities\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.257653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd8962b-9e85-452e-b0db-8d1f12109329-catalog-content\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.262339 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd8962b-9e85-452e-b0db-8d1f12109329-utilities\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.262628 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd8962b-9e85-452e-b0db-8d1f12109329-catalog-content\") pod 
\"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.283229 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk5k4\" (UniqueName: \"kubernetes.io/projected/1fd8962b-9e85-452e-b0db-8d1f12109329-kube-api-access-tk5k4\") pod \"community-operators-phmt9\" (UID: \"1fd8962b-9e85-452e-b0db-8d1f12109329\") " pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.433382 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:11 crc kubenswrapper[4797]: I1013 13:20:11.967680 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phmt9"] Oct 13 13:20:11 crc kubenswrapper[4797]: W1013 13:20:11.977620 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd8962b_9e85_452e_b0db_8d1f12109329.slice/crio-5ddf65468003ec3d84a34c69d5195fe1fd612db9fd9f8c45ee41b7d591f620f6 WatchSource:0}: Error finding container 5ddf65468003ec3d84a34c69d5195fe1fd612db9fd9f8c45ee41b7d591f620f6: Status 404 returned error can't find the container with id 5ddf65468003ec3d84a34c69d5195fe1fd612db9fd9f8c45ee41b7d591f620f6 Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.102431 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kws7b" Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.172842 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"9454612e1d4773ac258ee716371745b630756196c3da83223248eacbc65b1bef"} Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.172886 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"29257e58a3fa5abdcfc77bde21099fc407ce1ecd780e44fa516a6897a8d1f2b0"} Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.172899 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"36dc8c61a6a0dc5d5c8aaf38ec1fe02f9d510e37d852126896d7af370f4db3bb"} Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.172911 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"96846b5d3dff1e205881a418f95ef01c48b10922cf12a6cdb2629a1ddd6e7218"} Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.172923 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"7f0d59c7ea855d2f63799e061028eba6c408ea0c29154e42a6253bad1d03cf84"} Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.175004 4797 generic.go:334] "Generic (PLEG): container finished" podID="1fd8962b-9e85-452e-b0db-8d1f12109329" containerID="f4b446c8f476ce6cd2d2f70ec8f424cca6864f6d1d83024e5d19cba42caacbcb" exitCode=0 Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.175049 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phmt9" event={"ID":"1fd8962b-9e85-452e-b0db-8d1f12109329","Type":"ContainerDied","Data":"f4b446c8f476ce6cd2d2f70ec8f424cca6864f6d1d83024e5d19cba42caacbcb"} Oct 13 13:20:12 crc kubenswrapper[4797]: I1013 13:20:12.175079 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phmt9" 
event={"ID":"1fd8962b-9e85-452e-b0db-8d1f12109329","Type":"ContainerStarted","Data":"5ddf65468003ec3d84a34c69d5195fe1fd612db9fd9f8c45ee41b7d591f620f6"} Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.187867 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xjs8k" event={"ID":"44e2efb9-07fa-42db-a605-44970bbe88bc","Type":"ContainerStarted","Data":"935f701d0a6d9a17af734e58ca79a799093705d5a1483b8e269e4a446c0790b4"} Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.188165 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.212502 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xjs8k" podStartSLOduration=5.534148663 podStartE2EDuration="13.212481289s" podCreationTimestamp="2025-10-13 13:20:00 +0000 UTC" firstStartedPulling="2025-10-13 13:20:00.596954005 +0000 UTC m=+778.130504261" lastFinishedPulling="2025-10-13 13:20:08.275286631 +0000 UTC m=+785.808836887" observedRunningTime="2025-10-13 13:20:13.210438379 +0000 UTC m=+790.743988645" watchObservedRunningTime="2025-10-13 13:20:13.212481289 +0000 UTC m=+790.746031545" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.939726 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p"] Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.940788 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.942378 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.991708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6xd\" (UniqueName: \"kubernetes.io/projected/bde8aa22-3546-40b8-89bb-3415532d55b4-kube-api-access-2g6xd\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.991767 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.991797 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:13 crc kubenswrapper[4797]: I1013 13:20:13.999674 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p"] Oct 13 13:20:14 crc kubenswrapper[4797]: 
I1013 13:20:14.093487 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6xd\" (UniqueName: \"kubernetes.io/projected/bde8aa22-3546-40b8-89bb-3415532d55b4-kube-api-access-2g6xd\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.093595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.093645 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.094130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.094666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.125292 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6xd\" (UniqueName: \"kubernetes.io/projected/bde8aa22-3546-40b8-89bb-3415532d55b4-kube-api-access-2g6xd\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.300410 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:14 crc kubenswrapper[4797]: I1013 13:20:14.784280 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p"] Oct 13 13:20:15 crc kubenswrapper[4797]: I1013 13:20:15.464372 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:15 crc kubenswrapper[4797]: I1013 13:20:15.504604 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:17 crc kubenswrapper[4797]: I1013 13:20:17.212220 4797 generic.go:334] "Generic (PLEG): container finished" podID="1fd8962b-9e85-452e-b0db-8d1f12109329" containerID="6dfe4137e6570acd62fe97de58a54052b8fe85e11e8305a9c6bee137d67d29c2" exitCode=0 Oct 13 13:20:17 crc kubenswrapper[4797]: I1013 13:20:17.212295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phmt9" 
event={"ID":"1fd8962b-9e85-452e-b0db-8d1f12109329","Type":"ContainerDied","Data":"6dfe4137e6570acd62fe97de58a54052b8fe85e11e8305a9c6bee137d67d29c2"} Oct 13 13:20:17 crc kubenswrapper[4797]: I1013 13:20:17.215333 4797 generic.go:334] "Generic (PLEG): container finished" podID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerID="baa9afd301c896ae2f122d22c68722b1acc87ff3cf28ab0de00712fba134bc80" exitCode=0 Oct 13 13:20:17 crc kubenswrapper[4797]: I1013 13:20:17.215368 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" event={"ID":"bde8aa22-3546-40b8-89bb-3415532d55b4","Type":"ContainerDied","Data":"baa9afd301c896ae2f122d22c68722b1acc87ff3cf28ab0de00712fba134bc80"} Oct 13 13:20:17 crc kubenswrapper[4797]: I1013 13:20:17.215392 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" event={"ID":"bde8aa22-3546-40b8-89bb-3415532d55b4","Type":"ContainerStarted","Data":"1f9c4f84b2a88d502d097e563ad1d092b494c27d98946097c9d737a393e29a4b"} Oct 13 13:20:18 crc kubenswrapper[4797]: I1013 13:20:18.221940 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phmt9" event={"ID":"1fd8962b-9e85-452e-b0db-8d1f12109329","Type":"ContainerStarted","Data":"ae63e646a7a2c713825ca167751d5475093d94ec4755e190cdf00737a570ca99"} Oct 13 13:20:18 crc kubenswrapper[4797]: I1013 13:20:18.252570 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-phmt9" podStartSLOduration=1.7582052259999998 podStartE2EDuration="7.252552457s" podCreationTimestamp="2025-10-13 13:20:11 +0000 UTC" firstStartedPulling="2025-10-13 13:20:12.176179315 +0000 UTC m=+789.709729571" lastFinishedPulling="2025-10-13 13:20:17.670526506 +0000 UTC m=+795.204076802" observedRunningTime="2025-10-13 13:20:18.2485645 +0000 UTC 
m=+795.782114836" watchObservedRunningTime="2025-10-13 13:20:18.252552457 +0000 UTC m=+795.786102713" Oct 13 13:20:20 crc kubenswrapper[4797]: I1013 13:20:20.486881 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zxbq8" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.247152 4797 generic.go:334] "Generic (PLEG): container finished" podID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerID="d08535210bea4d915dcb1ccc6e11c4bcfe9fd110915e7a14981d95d40a2c7ab0" exitCode=0 Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.249017 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" event={"ID":"bde8aa22-3546-40b8-89bb-3415532d55b4","Type":"ContainerDied","Data":"d08535210bea4d915dcb1ccc6e11c4bcfe9fd110915e7a14981d95d40a2c7ab0"} Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.434394 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.434572 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.476193 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.509734 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5f7n"] Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.515656 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.558386 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5f7n"] Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.640743 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ttvr\" (UniqueName: \"kubernetes.io/projected/381c567b-38d2-4edb-9cde-d1df690d03b8-kube-api-access-5ttvr\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.640838 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-catalog-content\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.641064 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-utilities\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.742123 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-utilities\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.742209 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5ttvr\" (UniqueName: \"kubernetes.io/projected/381c567b-38d2-4edb-9cde-d1df690d03b8-kube-api-access-5ttvr\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.742239 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-catalog-content\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.742736 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-utilities\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.742754 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-catalog-content\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.761850 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ttvr\" (UniqueName: \"kubernetes.io/projected/381c567b-38d2-4edb-9cde-d1df690d03b8-kube-api-access-5ttvr\") pod \"redhat-marketplace-m5f7n\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:21 crc kubenswrapper[4797]: I1013 13:20:21.883870 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:22 crc kubenswrapper[4797]: I1013 13:20:22.255783 4797 generic.go:334] "Generic (PLEG): container finished" podID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerID="856261372cc1b77b31e668f298cadd5d97f6317ba5e72cc6404ef0291edcca68" exitCode=0 Oct 13 13:20:22 crc kubenswrapper[4797]: I1013 13:20:22.255871 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" event={"ID":"bde8aa22-3546-40b8-89bb-3415532d55b4","Type":"ContainerDied","Data":"856261372cc1b77b31e668f298cadd5d97f6317ba5e72cc6404ef0291edcca68"} Oct 13 13:20:22 crc kubenswrapper[4797]: I1013 13:20:22.294532 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5f7n"] Oct 13 13:20:22 crc kubenswrapper[4797]: W1013 13:20:22.304940 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod381c567b_38d2_4edb_9cde_d1df690d03b8.slice/crio-7a8edb463fdc2130954504c3acf7e54a4ae5bb83438752cb3af32167533ed72d WatchSource:0}: Error finding container 7a8edb463fdc2130954504c3acf7e54a4ae5bb83438752cb3af32167533ed72d: Status 404 returned error can't find the container with id 7a8edb463fdc2130954504c3acf7e54a4ae5bb83438752cb3af32167533ed72d Oct 13 13:20:22 crc kubenswrapper[4797]: I1013 13:20:22.319846 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-phmt9" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.263268 4797 generic.go:334] "Generic (PLEG): container finished" podID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerID="476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd" exitCode=0 Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.263351 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m5f7n" event={"ID":"381c567b-38d2-4edb-9cde-d1df690d03b8","Type":"ContainerDied","Data":"476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd"} Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.263926 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5f7n" event={"ID":"381c567b-38d2-4edb-9cde-d1df690d03b8","Type":"ContainerStarted","Data":"7a8edb463fdc2130954504c3acf7e54a4ae5bb83438752cb3af32167533ed72d"} Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.572138 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.768692 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-bundle\") pod \"bde8aa22-3546-40b8-89bb-3415532d55b4\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.768767 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-util\") pod \"bde8aa22-3546-40b8-89bb-3415532d55b4\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.768842 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g6xd\" (UniqueName: \"kubernetes.io/projected/bde8aa22-3546-40b8-89bb-3415532d55b4-kube-api-access-2g6xd\") pod \"bde8aa22-3546-40b8-89bb-3415532d55b4\" (UID: \"bde8aa22-3546-40b8-89bb-3415532d55b4\") " Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.771656 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-bundle" (OuterVolumeSpecName: "bundle") pod "bde8aa22-3546-40b8-89bb-3415532d55b4" (UID: "bde8aa22-3546-40b8-89bb-3415532d55b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.777541 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde8aa22-3546-40b8-89bb-3415532d55b4-kube-api-access-2g6xd" (OuterVolumeSpecName: "kube-api-access-2g6xd") pod "bde8aa22-3546-40b8-89bb-3415532d55b4" (UID: "bde8aa22-3546-40b8-89bb-3415532d55b4"). InnerVolumeSpecName "kube-api-access-2g6xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.790079 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-util" (OuterVolumeSpecName: "util") pod "bde8aa22-3546-40b8-89bb-3415532d55b4" (UID: "bde8aa22-3546-40b8-89bb-3415532d55b4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.869964 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.870000 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde8aa22-3546-40b8-89bb-3415532d55b4-util\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:23 crc kubenswrapper[4797]: I1013 13:20:23.870015 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g6xd\" (UniqueName: \"kubernetes.io/projected/bde8aa22-3546-40b8-89bb-3415532d55b4-kube-api-access-2g6xd\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:24 crc kubenswrapper[4797]: I1013 13:20:24.271344 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" event={"ID":"bde8aa22-3546-40b8-89bb-3415532d55b4","Type":"ContainerDied","Data":"1f9c4f84b2a88d502d097e563ad1d092b494c27d98946097c9d737a393e29a4b"} Oct 13 13:20:24 crc kubenswrapper[4797]: I1013 13:20:24.271613 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f9c4f84b2a88d502d097e563ad1d092b494c27d98946097c9d737a393e29a4b" Oct 13 13:20:24 crc kubenswrapper[4797]: I1013 13:20:24.271549 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p" Oct 13 13:20:24 crc kubenswrapper[4797]: I1013 13:20:24.278394 4797 generic.go:334] "Generic (PLEG): container finished" podID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerID="1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9" exitCode=0 Oct 13 13:20:24 crc kubenswrapper[4797]: I1013 13:20:24.278446 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5f7n" event={"ID":"381c567b-38d2-4edb-9cde-d1df690d03b8","Type":"ContainerDied","Data":"1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9"} Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.863994 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs"] Oct 13 13:20:27 crc kubenswrapper[4797]: E1013 13:20:27.864243 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="pull" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.864254 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="pull" Oct 13 13:20:27 crc kubenswrapper[4797]: E1013 13:20:27.864266 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="util" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.864272 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="util" Oct 13 13:20:27 crc kubenswrapper[4797]: E1013 13:20:27.864283 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="extract" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.864288 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="extract" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.864397 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde8aa22-3546-40b8-89bb-3415532d55b4" containerName="extract" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.864781 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.867654 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.867694 4797 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-pz9zb" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.868009 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.919671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzpd9\" (UniqueName: \"kubernetes.io/projected/4cf8ae28-99a9-43f9-a95d-9c0bc55e20df-kube-api-access-rzpd9\") pod \"cert-manager-operator-controller-manager-57cd46d6d-8wghs\" (UID: \"4cf8ae28-99a9-43f9-a95d-9c0bc55e20df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" Oct 13 13:20:27 crc kubenswrapper[4797]: I1013 13:20:27.925143 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs"] Oct 13 13:20:28 crc kubenswrapper[4797]: I1013 13:20:28.021135 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzpd9\" (UniqueName: 
\"kubernetes.io/projected/4cf8ae28-99a9-43f9-a95d-9c0bc55e20df-kube-api-access-rzpd9\") pod \"cert-manager-operator-controller-manager-57cd46d6d-8wghs\" (UID: \"4cf8ae28-99a9-43f9-a95d-9c0bc55e20df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" Oct 13 13:20:28 crc kubenswrapper[4797]: I1013 13:20:28.040259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzpd9\" (UniqueName: \"kubernetes.io/projected/4cf8ae28-99a9-43f9-a95d-9c0bc55e20df-kube-api-access-rzpd9\") pod \"cert-manager-operator-controller-manager-57cd46d6d-8wghs\" (UID: \"4cf8ae28-99a9-43f9-a95d-9c0bc55e20df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" Oct 13 13:20:28 crc kubenswrapper[4797]: I1013 13:20:28.178696 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" Oct 13 13:20:28 crc kubenswrapper[4797]: I1013 13:20:28.317839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5f7n" event={"ID":"381c567b-38d2-4edb-9cde-d1df690d03b8","Type":"ContainerStarted","Data":"bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2"} Oct 13 13:20:28 crc kubenswrapper[4797]: I1013 13:20:28.681468 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5f7n" podStartSLOduration=2.97777326 podStartE2EDuration="7.681451032s" podCreationTimestamp="2025-10-13 13:20:21 +0000 UTC" firstStartedPulling="2025-10-13 13:20:23.265613433 +0000 UTC m=+800.799163679" lastFinishedPulling="2025-10-13 13:20:27.969291195 +0000 UTC m=+805.502841451" observedRunningTime="2025-10-13 13:20:28.352268992 +0000 UTC m=+805.885819258" watchObservedRunningTime="2025-10-13 13:20:28.681451032 +0000 UTC m=+806.215001288" Oct 13 13:20:28 crc kubenswrapper[4797]: I1013 13:20:28.683818 4797 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs"] Oct 13 13:20:28 crc kubenswrapper[4797]: W1013 13:20:28.688037 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf8ae28_99a9_43f9_a95d_9c0bc55e20df.slice/crio-d725e519c1a553594fcc6705f89b8f8d9da6a3ec4690fa4545d6c82339629db9 WatchSource:0}: Error finding container d725e519c1a553594fcc6705f89b8f8d9da6a3ec4690fa4545d6c82339629db9: Status 404 returned error can't find the container with id d725e519c1a553594fcc6705f89b8f8d9da6a3ec4690fa4545d6c82339629db9 Oct 13 13:20:29 crc kubenswrapper[4797]: I1013 13:20:29.325702 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" event={"ID":"4cf8ae28-99a9-43f9-a95d-9c0bc55e20df","Type":"ContainerStarted","Data":"d725e519c1a553594fcc6705f89b8f8d9da6a3ec4690fa4545d6c82339629db9"} Oct 13 13:20:29 crc kubenswrapper[4797]: I1013 13:20:29.517069 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phmt9"] Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.296703 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wk2g"] Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.297076 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4wk2g" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="registry-server" containerID="cri-o://675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa" gracePeriod=2 Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.482343 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xjs8k" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.744549 4797 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.767622 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-utilities\") pod \"cefe52ef-b36f-4f16-90f5-dc15e699992e\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.767728 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-catalog-content\") pod \"cefe52ef-b36f-4f16-90f5-dc15e699992e\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.767773 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlfwl\" (UniqueName: \"kubernetes.io/projected/cefe52ef-b36f-4f16-90f5-dc15e699992e-kube-api-access-mlfwl\") pod \"cefe52ef-b36f-4f16-90f5-dc15e699992e\" (UID: \"cefe52ef-b36f-4f16-90f5-dc15e699992e\") " Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.768506 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-utilities" (OuterVolumeSpecName: "utilities") pod "cefe52ef-b36f-4f16-90f5-dc15e699992e" (UID: "cefe52ef-b36f-4f16-90f5-dc15e699992e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.772878 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefe52ef-b36f-4f16-90f5-dc15e699992e-kube-api-access-mlfwl" (OuterVolumeSpecName: "kube-api-access-mlfwl") pod "cefe52ef-b36f-4f16-90f5-dc15e699992e" (UID: "cefe52ef-b36f-4f16-90f5-dc15e699992e"). 
InnerVolumeSpecName "kube-api-access-mlfwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.844055 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cefe52ef-b36f-4f16-90f5-dc15e699992e" (UID: "cefe52ef-b36f-4f16-90f5-dc15e699992e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.871187 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.871220 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefe52ef-b36f-4f16-90f5-dc15e699992e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:30 crc kubenswrapper[4797]: I1013 13:20:30.871232 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlfwl\" (UniqueName: \"kubernetes.io/projected/cefe52ef-b36f-4f16-90f5-dc15e699992e-kube-api-access-mlfwl\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.346728 4797 generic.go:334] "Generic (PLEG): container finished" podID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerID="675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa" exitCode=0 Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.346773 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerDied","Data":"675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa"} Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.346820 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk2g" event={"ID":"cefe52ef-b36f-4f16-90f5-dc15e699992e","Type":"ContainerDied","Data":"279b724e9dbb2f68260b9fee23a86a073fd815d8b2099dc5aa940950ba9e53b1"} Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.346841 4797 scope.go:117] "RemoveContainer" containerID="675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.346852 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wk2g" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.370231 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wk2g"] Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.375525 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4wk2g"] Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.383277 4797 scope.go:117] "RemoveContainer" containerID="48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.412697 4797 scope.go:117] "RemoveContainer" containerID="3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.452016 4797 scope.go:117] "RemoveContainer" containerID="675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa" Oct 13 13:20:31 crc kubenswrapper[4797]: E1013 13:20:31.452502 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa\": container with ID starting with 675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa not found: ID does not exist" containerID="675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa" Oct 13 13:20:31 
crc kubenswrapper[4797]: I1013 13:20:31.452544 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa"} err="failed to get container status \"675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa\": rpc error: code = NotFound desc = could not find container \"675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa\": container with ID starting with 675a6af72d1141a6cfe21384ffeb5d54670a908da000770e1f47a428775c79fa not found: ID does not exist" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.452572 4797 scope.go:117] "RemoveContainer" containerID="48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127" Oct 13 13:20:31 crc kubenswrapper[4797]: E1013 13:20:31.454007 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127\": container with ID starting with 48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127 not found: ID does not exist" containerID="48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.454047 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127"} err="failed to get container status \"48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127\": rpc error: code = NotFound desc = could not find container \"48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127\": container with ID starting with 48ee501209b1dc62a1d1a5e7bc5cc52ff73d8807336561873f0696686b599127 not found: ID does not exist" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.454075 4797 scope.go:117] "RemoveContainer" containerID="3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25" Oct 13 
13:20:31 crc kubenswrapper[4797]: E1013 13:20:31.454749 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25\": container with ID starting with 3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25 not found: ID does not exist" containerID="3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.454778 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25"} err="failed to get container status \"3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25\": rpc error: code = NotFound desc = could not find container \"3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25\": container with ID starting with 3ccb1e47c1bb844d88836462ee24b737103c70b80721843749c8ce4d51889e25 not found: ID does not exist" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.884576 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.884654 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:31 crc kubenswrapper[4797]: I1013 13:20:31.920370 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:33 crc kubenswrapper[4797]: I1013 13:20:33.246264 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" path="/var/lib/kubelet/pods/cefe52ef-b36f-4f16-90f5-dc15e699992e/volumes" Oct 13 13:20:36 crc kubenswrapper[4797]: I1013 13:20:36.405892 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" event={"ID":"4cf8ae28-99a9-43f9-a95d-9c0bc55e20df","Type":"ContainerStarted","Data":"fa23a7e0f5fd2508bb47af51cf22d163da808cefd6866321e63ff64a7b6cc138"} Oct 13 13:20:36 crc kubenswrapper[4797]: I1013 13:20:36.425678 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-8wghs" podStartSLOduration=1.973829311 podStartE2EDuration="9.425663782s" podCreationTimestamp="2025-10-13 13:20:27 +0000 UTC" firstStartedPulling="2025-10-13 13:20:28.690926954 +0000 UTC m=+806.224477210" lastFinishedPulling="2025-10-13 13:20:36.142761425 +0000 UTC m=+813.676311681" observedRunningTime="2025-10-13 13:20:36.424094313 +0000 UTC m=+813.957644609" watchObservedRunningTime="2025-10-13 13:20:36.425663782 +0000 UTC m=+813.959214028" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.472236 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-h8b9w"] Oct 13 13:20:39 crc kubenswrapper[4797]: E1013 13:20:39.473228 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="extract-utilities" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.473247 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="extract-utilities" Oct 13 13:20:39 crc kubenswrapper[4797]: E1013 13:20:39.473281 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="extract-content" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.473287 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="extract-content" Oct 13 13:20:39 crc kubenswrapper[4797]: E1013 13:20:39.473297 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="registry-server" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.473305 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="registry-server" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.473429 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefe52ef-b36f-4f16-90f5-dc15e699992e" containerName="registry-server" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.474021 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.476469 4797 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dpqj9" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.476716 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.476879 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.489647 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-h8b9w"] Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.581628 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc98b\" (UniqueName: \"kubernetes.io/projected/ec3a8923-32a5-467e-949b-0450963f0afb-kube-api-access-hc98b\") pod \"cert-manager-webhook-d969966f-h8b9w\" (UID: \"ec3a8923-32a5-467e-949b-0450963f0afb\") " pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.581713 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/ec3a8923-32a5-467e-949b-0450963f0afb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-h8b9w\" (UID: \"ec3a8923-32a5-467e-949b-0450963f0afb\") " pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.682999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc98b\" (UniqueName: \"kubernetes.io/projected/ec3a8923-32a5-467e-949b-0450963f0afb-kube-api-access-hc98b\") pod \"cert-manager-webhook-d969966f-h8b9w\" (UID: \"ec3a8923-32a5-467e-949b-0450963f0afb\") " pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.683052 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec3a8923-32a5-467e-949b-0450963f0afb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-h8b9w\" (UID: \"ec3a8923-32a5-467e-949b-0450963f0afb\") " pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.700504 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec3a8923-32a5-467e-949b-0450963f0afb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-h8b9w\" (UID: \"ec3a8923-32a5-467e-949b-0450963f0afb\") " pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.700620 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc98b\" (UniqueName: \"kubernetes.io/projected/ec3a8923-32a5-467e-949b-0450963f0afb-kube-api-access-hc98b\") pod \"cert-manager-webhook-d969966f-h8b9w\" (UID: \"ec3a8923-32a5-467e-949b-0450963f0afb\") " pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:39 crc kubenswrapper[4797]: I1013 13:20:39.829605 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:40 crc kubenswrapper[4797]: I1013 13:20:40.252830 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-h8b9w"] Oct 13 13:20:40 crc kubenswrapper[4797]: I1013 13:20:40.427879 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" event={"ID":"ec3a8923-32a5-467e-949b-0450963f0afb","Type":"ContainerStarted","Data":"6fc68fdc2d9b44bda4a78fbabb9524cd60f24f2dc4375dbc6a3401b503e988d9"} Oct 13 13:20:41 crc kubenswrapper[4797]: I1013 13:20:41.929034 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.100158 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5f7n"] Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.439374 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m5f7n" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="registry-server" containerID="cri-o://bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2" gracePeriod=2 Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.841690 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.983727 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5"] Oct 13 13:20:42 crc kubenswrapper[4797]: E1013 13:20:42.983974 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="registry-server" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.983986 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="registry-server" Oct 13 13:20:42 crc kubenswrapper[4797]: E1013 13:20:42.983997 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="extract-utilities" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.984003 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="extract-utilities" Oct 13 13:20:42 crc kubenswrapper[4797]: E1013 13:20:42.984012 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="extract-content" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.984018 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="extract-content" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.984128 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerName="registry-server" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.984483 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.987968 4797 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-g7fjb" Oct 13 13:20:42 crc kubenswrapper[4797]: I1013 13:20:42.993726 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5"] Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.048601 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-catalog-content\") pod \"381c567b-38d2-4edb-9cde-d1df690d03b8\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.048649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ttvr\" (UniqueName: \"kubernetes.io/projected/381c567b-38d2-4edb-9cde-d1df690d03b8-kube-api-access-5ttvr\") pod \"381c567b-38d2-4edb-9cde-d1df690d03b8\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.048743 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-utilities\") pod \"381c567b-38d2-4edb-9cde-d1df690d03b8\" (UID: \"381c567b-38d2-4edb-9cde-d1df690d03b8\") " Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.048888 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phnvk\" (UniqueName: \"kubernetes.io/projected/99855fda-83c3-48ad-bdea-d1f21a0407fd-kube-api-access-phnvk\") pod \"cert-manager-cainjector-7d9f95dbf-f2mp5\" (UID: \"99855fda-83c3-48ad-bdea-d1f21a0407fd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc 
kubenswrapper[4797]: I1013 13:20:43.048935 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99855fda-83c3-48ad-bdea-d1f21a0407fd-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-f2mp5\" (UID: \"99855fda-83c3-48ad-bdea-d1f21a0407fd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.049682 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-utilities" (OuterVolumeSpecName: "utilities") pod "381c567b-38d2-4edb-9cde-d1df690d03b8" (UID: "381c567b-38d2-4edb-9cde-d1df690d03b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.063094 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381c567b-38d2-4edb-9cde-d1df690d03b8-kube-api-access-5ttvr" (OuterVolumeSpecName: "kube-api-access-5ttvr") pod "381c567b-38d2-4edb-9cde-d1df690d03b8" (UID: "381c567b-38d2-4edb-9cde-d1df690d03b8"). InnerVolumeSpecName "kube-api-access-5ttvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.064388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "381c567b-38d2-4edb-9cde-d1df690d03b8" (UID: "381c567b-38d2-4edb-9cde-d1df690d03b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.149692 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phnvk\" (UniqueName: \"kubernetes.io/projected/99855fda-83c3-48ad-bdea-d1f21a0407fd-kube-api-access-phnvk\") pod \"cert-manager-cainjector-7d9f95dbf-f2mp5\" (UID: \"99855fda-83c3-48ad-bdea-d1f21a0407fd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.149745 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99855fda-83c3-48ad-bdea-d1f21a0407fd-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-f2mp5\" (UID: \"99855fda-83c3-48ad-bdea-d1f21a0407fd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.150158 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.150205 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ttvr\" (UniqueName: \"kubernetes.io/projected/381c567b-38d2-4edb-9cde-d1df690d03b8-kube-api-access-5ttvr\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.150347 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/381c567b-38d2-4edb-9cde-d1df690d03b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.179529 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99855fda-83c3-48ad-bdea-d1f21a0407fd-bound-sa-token\") pod 
\"cert-manager-cainjector-7d9f95dbf-f2mp5\" (UID: \"99855fda-83c3-48ad-bdea-d1f21a0407fd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.182624 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phnvk\" (UniqueName: \"kubernetes.io/projected/99855fda-83c3-48ad-bdea-d1f21a0407fd-kube-api-access-phnvk\") pod \"cert-manager-cainjector-7d9f95dbf-f2mp5\" (UID: \"99855fda-83c3-48ad-bdea-d1f21a0407fd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.376119 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.446511 4797 generic.go:334] "Generic (PLEG): container finished" podID="381c567b-38d2-4edb-9cde-d1df690d03b8" containerID="bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2" exitCode=0 Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.446564 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5f7n" event={"ID":"381c567b-38d2-4edb-9cde-d1df690d03b8","Type":"ContainerDied","Data":"bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2"} Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.446571 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5f7n" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.446597 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5f7n" event={"ID":"381c567b-38d2-4edb-9cde-d1df690d03b8","Type":"ContainerDied","Data":"7a8edb463fdc2130954504c3acf7e54a4ae5bb83438752cb3af32167533ed72d"} Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.446620 4797 scope.go:117] "RemoveContainer" containerID="bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2" Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.468364 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5f7n"] Oct 13 13:20:43 crc kubenswrapper[4797]: I1013 13:20:43.469830 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5f7n"] Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.404006 4797 scope.go:117] "RemoveContainer" containerID="1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.452523 4797 scope.go:117] "RemoveContainer" containerID="476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.497593 4797 scope.go:117] "RemoveContainer" containerID="bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2" Oct 13 13:20:44 crc kubenswrapper[4797]: E1013 13:20:44.498077 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2\": container with ID starting with bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2 not found: ID does not exist" containerID="bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.498129 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2"} err="failed to get container status \"bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2\": rpc error: code = NotFound desc = could not find container \"bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2\": container with ID starting with bb2438804d222844b1b35a9e35d0d591b1fc60217955b55b4b500e57e073dfc2 not found: ID does not exist" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.498162 4797 scope.go:117] "RemoveContainer" containerID="1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9" Oct 13 13:20:44 crc kubenswrapper[4797]: E1013 13:20:44.498452 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9\": container with ID starting with 1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9 not found: ID does not exist" containerID="1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.498492 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9"} err="failed to get container status \"1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9\": rpc error: code = NotFound desc = could not find container \"1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9\": container with ID starting with 1ac34f030b3d86002eba5111b9f5a9c1689dbfb079ccdd1d7bd89b62b536c7b9 not found: ID does not exist" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.498517 4797 scope.go:117] "RemoveContainer" containerID="476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd" Oct 13 13:20:44 crc kubenswrapper[4797]: E1013 
13:20:44.498843 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd\": container with ID starting with 476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd not found: ID does not exist" containerID="476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.498870 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd"} err="failed to get container status \"476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd\": rpc error: code = NotFound desc = could not find container \"476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd\": container with ID starting with 476e000e19233479f35557a0fbe8defe8d61e03d37b43a4abf5c6b1f493990dd not found: ID does not exist" Oct 13 13:20:44 crc kubenswrapper[4797]: I1013 13:20:44.864454 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5"] Oct 13 13:20:44 crc kubenswrapper[4797]: W1013 13:20:44.870792 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99855fda_83c3_48ad_bdea_d1f21a0407fd.slice/crio-5cc25702cdb9806f15a366698505488db0ffb28e96d83f9fe71cb192596e40b1 WatchSource:0}: Error finding container 5cc25702cdb9806f15a366698505488db0ffb28e96d83f9fe71cb192596e40b1: Status 404 returned error can't find the container with id 5cc25702cdb9806f15a366698505488db0ffb28e96d83f9fe71cb192596e40b1 Oct 13 13:20:45 crc kubenswrapper[4797]: I1013 13:20:45.243976 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381c567b-38d2-4edb-9cde-d1df690d03b8" path="/var/lib/kubelet/pods/381c567b-38d2-4edb-9cde-d1df690d03b8/volumes" Oct 13 13:20:45 crc 
kubenswrapper[4797]: I1013 13:20:45.463277 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" event={"ID":"99855fda-83c3-48ad-bdea-d1f21a0407fd","Type":"ContainerStarted","Data":"5619ca85e9cf85b4822d47800370a7863ccc03d4d45b8c4a8461670fe870a567"} Oct 13 13:20:45 crc kubenswrapper[4797]: I1013 13:20:45.463347 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" event={"ID":"99855fda-83c3-48ad-bdea-d1f21a0407fd","Type":"ContainerStarted","Data":"5cc25702cdb9806f15a366698505488db0ffb28e96d83f9fe71cb192596e40b1"} Oct 13 13:20:45 crc kubenswrapper[4797]: I1013 13:20:45.465648 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" event={"ID":"ec3a8923-32a5-467e-949b-0450963f0afb","Type":"ContainerStarted","Data":"b332454fa1a28ac5115d3ec78f16e016bfc599607b4baaca108f44f4c7a72e9d"} Oct 13 13:20:45 crc kubenswrapper[4797]: I1013 13:20:45.465777 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:45 crc kubenswrapper[4797]: I1013 13:20:45.486198 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-f2mp5" podStartSLOduration=3.486170681 podStartE2EDuration="3.486170681s" podCreationTimestamp="2025-10-13 13:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:20:45.484201393 +0000 UTC m=+823.017751649" watchObservedRunningTime="2025-10-13 13:20:45.486170681 +0000 UTC m=+823.019720967" Oct 13 13:20:45 crc kubenswrapper[4797]: I1013 13:20:45.508397 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" podStartSLOduration=2.301740885 podStartE2EDuration="6.508372185s" 
podCreationTimestamp="2025-10-13 13:20:39 +0000 UTC" firstStartedPulling="2025-10-13 13:20:40.267367137 +0000 UTC m=+817.800917393" lastFinishedPulling="2025-10-13 13:20:44.473998437 +0000 UTC m=+822.007548693" observedRunningTime="2025-10-13 13:20:45.505763191 +0000 UTC m=+823.039313477" watchObservedRunningTime="2025-10-13 13:20:45.508372185 +0000 UTC m=+823.041922471" Oct 13 13:20:49 crc kubenswrapper[4797]: I1013 13:20:49.833325 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-h8b9w" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.524623 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-p25dd"] Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.526719 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.540513 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-p25dd"] Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.559447 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5903e7e4-5c29-4320-9c08-cccaab0cd30f-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-p25dd\" (UID: \"5903e7e4-5c29-4320-9c08-cccaab0cd30f\") " pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.559590 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7pkp\" (UniqueName: \"kubernetes.io/projected/5903e7e4-5c29-4320-9c08-cccaab0cd30f-kube-api-access-j7pkp\") pod \"cert-manager-7d4cc89fcb-p25dd\" (UID: \"5903e7e4-5c29-4320-9c08-cccaab0cd30f\") " pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.562029 4797 
reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6lxgz" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.661472 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7pkp\" (UniqueName: \"kubernetes.io/projected/5903e7e4-5c29-4320-9c08-cccaab0cd30f-kube-api-access-j7pkp\") pod \"cert-manager-7d4cc89fcb-p25dd\" (UID: \"5903e7e4-5c29-4320-9c08-cccaab0cd30f\") " pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.661552 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5903e7e4-5c29-4320-9c08-cccaab0cd30f-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-p25dd\" (UID: \"5903e7e4-5c29-4320-9c08-cccaab0cd30f\") " pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.688752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5903e7e4-5c29-4320-9c08-cccaab0cd30f-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-p25dd\" (UID: \"5903e7e4-5c29-4320-9c08-cccaab0cd30f\") " pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.691940 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7pkp\" (UniqueName: \"kubernetes.io/projected/5903e7e4-5c29-4320-9c08-cccaab0cd30f-kube-api-access-j7pkp\") pod \"cert-manager-7d4cc89fcb-p25dd\" (UID: \"5903e7e4-5c29-4320-9c08-cccaab0cd30f\") " pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:58 crc kubenswrapper[4797]: I1013 13:20:58.857914 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" Oct 13 13:20:59 crc kubenswrapper[4797]: I1013 13:20:59.107366 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-p25dd"] Oct 13 13:20:59 crc kubenswrapper[4797]: I1013 13:20:59.566457 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" event={"ID":"5903e7e4-5c29-4320-9c08-cccaab0cd30f","Type":"ContainerStarted","Data":"bb960c067760b8fe7fa2b5d87b9e3367bbf3dee837dfc30790fbf0d1474d04d2"} Oct 13 13:20:59 crc kubenswrapper[4797]: I1013 13:20:59.566841 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" event={"ID":"5903e7e4-5c29-4320-9c08-cccaab0cd30f","Type":"ContainerStarted","Data":"fab735d50aff31483b645ae1eaa1262560416e012072e6f2e6e9c40a2e623ca4"} Oct 13 13:20:59 crc kubenswrapper[4797]: I1013 13:20:59.595324 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-p25dd" podStartSLOduration=1.5952893179999998 podStartE2EDuration="1.595289318s" podCreationTimestamp="2025-10-13 13:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:20:59.587935448 +0000 UTC m=+837.121485754" watchObservedRunningTime="2025-10-13 13:20:59.595289318 +0000 UTC m=+837.128839614" Oct 13 13:21:03 crc kubenswrapper[4797]: I1013 13:21:03.883943 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ntxnj"] Oct 13 13:21:03 crc kubenswrapper[4797]: I1013 13:21:03.886358 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:03 crc kubenswrapper[4797]: I1013 13:21:03.889230 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-d277s" Oct 13 13:21:03 crc kubenswrapper[4797]: I1013 13:21:03.889642 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 13 13:21:03 crc kubenswrapper[4797]: I1013 13:21:03.899609 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 13 13:21:03 crc kubenswrapper[4797]: I1013 13:21:03.902680 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ntxnj"] Oct 13 13:21:04 crc kubenswrapper[4797]: I1013 13:21:04.037137 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7dgb\" (UniqueName: \"kubernetes.io/projected/91facd8c-45d7-4db5-936d-d1851e13b51c-kube-api-access-b7dgb\") pod \"openstack-operator-index-ntxnj\" (UID: \"91facd8c-45d7-4db5-936d-d1851e13b51c\") " pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:04 crc kubenswrapper[4797]: I1013 13:21:04.138403 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7dgb\" (UniqueName: \"kubernetes.io/projected/91facd8c-45d7-4db5-936d-d1851e13b51c-kube-api-access-b7dgb\") pod \"openstack-operator-index-ntxnj\" (UID: \"91facd8c-45d7-4db5-936d-d1851e13b51c\") " pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:04 crc kubenswrapper[4797]: I1013 13:21:04.161492 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7dgb\" (UniqueName: \"kubernetes.io/projected/91facd8c-45d7-4db5-936d-d1851e13b51c-kube-api-access-b7dgb\") pod \"openstack-operator-index-ntxnj\" (UID: 
\"91facd8c-45d7-4db5-936d-d1851e13b51c\") " pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:04 crc kubenswrapper[4797]: I1013 13:21:04.205607 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:04 crc kubenswrapper[4797]: I1013 13:21:04.584409 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ntxnj"] Oct 13 13:21:05 crc kubenswrapper[4797]: I1013 13:21:05.603083 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ntxnj" event={"ID":"91facd8c-45d7-4db5-936d-d1851e13b51c","Type":"ContainerStarted","Data":"21c764454af4ddcb5bc6bccbe646db6e958b35384e2079a5b6ad7aa37911c5f7"} Oct 13 13:21:06 crc kubenswrapper[4797]: I1013 13:21:06.611512 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ntxnj" event={"ID":"91facd8c-45d7-4db5-936d-d1851e13b51c","Type":"ContainerStarted","Data":"439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59"} Oct 13 13:21:06 crc kubenswrapper[4797]: I1013 13:21:06.629532 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ntxnj" podStartSLOduration=2.710517798 podStartE2EDuration="3.629513749s" podCreationTimestamp="2025-10-13 13:21:03 +0000 UTC" firstStartedPulling="2025-10-13 13:21:04.596985084 +0000 UTC m=+842.130535340" lastFinishedPulling="2025-10-13 13:21:05.515981025 +0000 UTC m=+843.049531291" observedRunningTime="2025-10-13 13:21:06.627688494 +0000 UTC m=+844.161238790" watchObservedRunningTime="2025-10-13 13:21:06.629513749 +0000 UTC m=+844.163064005" Oct 13 13:21:07 crc kubenswrapper[4797]: I1013 13:21:07.273759 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ntxnj"] Oct 13 13:21:07 crc kubenswrapper[4797]: I1013 13:21:07.875177 
4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p2rmj"] Oct 13 13:21:07 crc kubenswrapper[4797]: I1013 13:21:07.876354 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:07 crc kubenswrapper[4797]: I1013 13:21:07.884897 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p2rmj"] Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.000565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77hv\" (UniqueName: \"kubernetes.io/projected/bb202dd1-eb2d-4c54-aa6e-160a65f46f21-kube-api-access-g77hv\") pod \"openstack-operator-index-p2rmj\" (UID: \"bb202dd1-eb2d-4c54-aa6e-160a65f46f21\") " pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.102220 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g77hv\" (UniqueName: \"kubernetes.io/projected/bb202dd1-eb2d-4c54-aa6e-160a65f46f21-kube-api-access-g77hv\") pod \"openstack-operator-index-p2rmj\" (UID: \"bb202dd1-eb2d-4c54-aa6e-160a65f46f21\") " pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.132196 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77hv\" (UniqueName: \"kubernetes.io/projected/bb202dd1-eb2d-4c54-aa6e-160a65f46f21-kube-api-access-g77hv\") pod \"openstack-operator-index-p2rmj\" (UID: \"bb202dd1-eb2d-4c54-aa6e-160a65f46f21\") " pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.210239 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.599346 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p2rmj"] Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.627486 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p2rmj" event={"ID":"bb202dd1-eb2d-4c54-aa6e-160a65f46f21","Type":"ContainerStarted","Data":"6162272139a2e69f460827c19d9341d8627cc4cad2f2cf17eb4f74bc7278af4f"} Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.627702 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ntxnj" podUID="91facd8c-45d7-4db5-936d-d1851e13b51c" containerName="registry-server" containerID="cri-o://439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59" gracePeriod=2 Oct 13 13:21:08 crc kubenswrapper[4797]: I1013 13:21:08.961455 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.116431 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7dgb\" (UniqueName: \"kubernetes.io/projected/91facd8c-45d7-4db5-936d-d1851e13b51c-kube-api-access-b7dgb\") pod \"91facd8c-45d7-4db5-936d-d1851e13b51c\" (UID: \"91facd8c-45d7-4db5-936d-d1851e13b51c\") " Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.122580 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91facd8c-45d7-4db5-936d-d1851e13b51c-kube-api-access-b7dgb" (OuterVolumeSpecName: "kube-api-access-b7dgb") pod "91facd8c-45d7-4db5-936d-d1851e13b51c" (UID: "91facd8c-45d7-4db5-936d-d1851e13b51c"). InnerVolumeSpecName "kube-api-access-b7dgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.218039 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7dgb\" (UniqueName: \"kubernetes.io/projected/91facd8c-45d7-4db5-936d-d1851e13b51c-kube-api-access-b7dgb\") on node \"crc\" DevicePath \"\"" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.636359 4797 generic.go:334] "Generic (PLEG): container finished" podID="91facd8c-45d7-4db5-936d-d1851e13b51c" containerID="439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59" exitCode=0 Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.636395 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ntxnj" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.636432 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ntxnj" event={"ID":"91facd8c-45d7-4db5-936d-d1851e13b51c","Type":"ContainerDied","Data":"439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59"} Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.636508 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ntxnj" event={"ID":"91facd8c-45d7-4db5-936d-d1851e13b51c","Type":"ContainerDied","Data":"21c764454af4ddcb5bc6bccbe646db6e958b35384e2079a5b6ad7aa37911c5f7"} Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.636553 4797 scope.go:117] "RemoveContainer" containerID="439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.638706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p2rmj" event={"ID":"bb202dd1-eb2d-4c54-aa6e-160a65f46f21","Type":"ContainerStarted","Data":"1079d9a607d70f4afd325d0e8efbe4165750fb894c856add3e1c60f5834272a8"} Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 
13:21:09.663551 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p2rmj" podStartSLOduration=2.268200277 podStartE2EDuration="2.663528488s" podCreationTimestamp="2025-10-13 13:21:07 +0000 UTC" firstStartedPulling="2025-10-13 13:21:08.603270442 +0000 UTC m=+846.136820738" lastFinishedPulling="2025-10-13 13:21:08.998598693 +0000 UTC m=+846.532148949" observedRunningTime="2025-10-13 13:21:09.65502816 +0000 UTC m=+847.188578436" watchObservedRunningTime="2025-10-13 13:21:09.663528488 +0000 UTC m=+847.197078754" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.668515 4797 scope.go:117] "RemoveContainer" containerID="439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59" Oct 13 13:21:09 crc kubenswrapper[4797]: E1013 13:21:09.669290 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59\": container with ID starting with 439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59 not found: ID does not exist" containerID="439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.669324 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59"} err="failed to get container status \"439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59\": rpc error: code = NotFound desc = could not find container \"439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59\": container with ID starting with 439836a8c5a30efd51b453cca892998df677449d579f61fc69f656350b1c7c59 not found: ID does not exist" Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.673344 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ntxnj"] 
Oct 13 13:21:09 crc kubenswrapper[4797]: I1013 13:21:09.676546 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ntxnj"] Oct 13 13:21:11 crc kubenswrapper[4797]: I1013 13:21:11.253441 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91facd8c-45d7-4db5-936d-d1851e13b51c" path="/var/lib/kubelet/pods/91facd8c-45d7-4db5-936d-d1851e13b51c/volumes" Oct 13 13:21:18 crc kubenswrapper[4797]: I1013 13:21:18.211302 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:18 crc kubenswrapper[4797]: I1013 13:21:18.212100 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:18 crc kubenswrapper[4797]: I1013 13:21:18.255091 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:18 crc kubenswrapper[4797]: I1013 13:21:18.746229 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p2rmj" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.508294 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg"] Oct 13 13:21:25 crc kubenswrapper[4797]: E1013 13:21:25.509832 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91facd8c-45d7-4db5-936d-d1851e13b51c" containerName="registry-server" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.509858 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="91facd8c-45d7-4db5-936d-d1851e13b51c" containerName="registry-server" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.510047 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="91facd8c-45d7-4db5-936d-d1851e13b51c" 
containerName="registry-server" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.511389 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.513512 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sv657" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.524721 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg"] Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.594976 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx97w\" (UniqueName: \"kubernetes.io/projected/24115c99-7b3e-44b3-b517-45c8118d5645-kube-api-access-kx97w\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.595322 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-bundle\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.595426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-util\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " 
pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.696894 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx97w\" (UniqueName: \"kubernetes.io/projected/24115c99-7b3e-44b3-b517-45c8118d5645-kube-api-access-kx97w\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.697355 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-bundle\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.697514 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-util\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.697824 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-bundle\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.698411 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-util\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.723703 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx97w\" (UniqueName: \"kubernetes.io/projected/24115c99-7b3e-44b3-b517-45c8118d5645-kube-api-access-kx97w\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:25 crc kubenswrapper[4797]: I1013 13:21:25.838978 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:26 crc kubenswrapper[4797]: I1013 13:21:26.343489 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg"] Oct 13 13:21:26 crc kubenswrapper[4797]: I1013 13:21:26.773955 4797 generic.go:334] "Generic (PLEG): container finished" podID="24115c99-7b3e-44b3-b517-45c8118d5645" containerID="64fe3c3d3049252ad9cf1f2d1ae028166104d1bd0076d659d50d83de424d0e19" exitCode=0 Oct 13 13:21:26 crc kubenswrapper[4797]: I1013 13:21:26.773999 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" event={"ID":"24115c99-7b3e-44b3-b517-45c8118d5645","Type":"ContainerDied","Data":"64fe3c3d3049252ad9cf1f2d1ae028166104d1bd0076d659d50d83de424d0e19"} Oct 13 13:21:26 crc kubenswrapper[4797]: I1013 13:21:26.774023 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" event={"ID":"24115c99-7b3e-44b3-b517-45c8118d5645","Type":"ContainerStarted","Data":"4238f746143882cb679c904f56882ad4ba99fd4ba1ee29077a76ae2f120c6916"} Oct 13 13:21:27 crc kubenswrapper[4797]: I1013 13:21:27.781730 4797 generic.go:334] "Generic (PLEG): container finished" podID="24115c99-7b3e-44b3-b517-45c8118d5645" containerID="c934b8848881c67bca4f086e7e58c5c30ea3cdc2d26547b530408a0945ff1146" exitCode=0 Oct 13 13:21:27 crc kubenswrapper[4797]: I1013 13:21:27.781946 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" event={"ID":"24115c99-7b3e-44b3-b517-45c8118d5645","Type":"ContainerDied","Data":"c934b8848881c67bca4f086e7e58c5c30ea3cdc2d26547b530408a0945ff1146"} Oct 13 13:21:28 crc kubenswrapper[4797]: I1013 13:21:28.795994 4797 generic.go:334] "Generic (PLEG): container finished" podID="24115c99-7b3e-44b3-b517-45c8118d5645" containerID="cb2e1823b6a46b4e8810ce99e0d3b7829173cd2f4f2379648b6d5ec203bc30c9" exitCode=0 Oct 13 13:21:28 crc kubenswrapper[4797]: I1013 13:21:28.796061 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" event={"ID":"24115c99-7b3e-44b3-b517-45c8118d5645","Type":"ContainerDied","Data":"cb2e1823b6a46b4e8810ce99e0d3b7829173cd2f4f2379648b6d5ec203bc30c9"} Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.083580 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.169167 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-bundle\") pod \"24115c99-7b3e-44b3-b517-45c8118d5645\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.169216 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx97w\" (UniqueName: \"kubernetes.io/projected/24115c99-7b3e-44b3-b517-45c8118d5645-kube-api-access-kx97w\") pod \"24115c99-7b3e-44b3-b517-45c8118d5645\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.169265 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-util\") pod \"24115c99-7b3e-44b3-b517-45c8118d5645\" (UID: \"24115c99-7b3e-44b3-b517-45c8118d5645\") " Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.170499 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-bundle" (OuterVolumeSpecName: "bundle") pod "24115c99-7b3e-44b3-b517-45c8118d5645" (UID: "24115c99-7b3e-44b3-b517-45c8118d5645"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.174037 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24115c99-7b3e-44b3-b517-45c8118d5645-kube-api-access-kx97w" (OuterVolumeSpecName: "kube-api-access-kx97w") pod "24115c99-7b3e-44b3-b517-45c8118d5645" (UID: "24115c99-7b3e-44b3-b517-45c8118d5645"). InnerVolumeSpecName "kube-api-access-kx97w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.182319 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-util" (OuterVolumeSpecName: "util") pod "24115c99-7b3e-44b3-b517-45c8118d5645" (UID: "24115c99-7b3e-44b3-b517-45c8118d5645"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.270698 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.270741 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx97w\" (UniqueName: \"kubernetes.io/projected/24115c99-7b3e-44b3-b517-45c8118d5645-kube-api-access-kx97w\") on node \"crc\" DevicePath \"\"" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.270756 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24115c99-7b3e-44b3-b517-45c8118d5645-util\") on node \"crc\" DevicePath \"\"" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.814906 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" event={"ID":"24115c99-7b3e-44b3-b517-45c8118d5645","Type":"ContainerDied","Data":"4238f746143882cb679c904f56882ad4ba99fd4ba1ee29077a76ae2f120c6916"} Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.814952 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4238f746143882cb679c904f56882ad4ba99fd4ba1ee29077a76ae2f120c6916" Oct 13 13:21:30 crc kubenswrapper[4797]: I1013 13:21:30.814955 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.063075 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj"] Oct 13 13:21:38 crc kubenswrapper[4797]: E1013 13:21:38.063916 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="util" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.063934 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="util" Oct 13 13:21:38 crc kubenswrapper[4797]: E1013 13:21:38.063951 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="extract" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.063959 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="extract" Oct 13 13:21:38 crc kubenswrapper[4797]: E1013 13:21:38.063980 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="pull" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.063989 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="pull" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.064125 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="24115c99-7b3e-44b3-b517-45c8118d5645" containerName="extract" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.064923 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.068276 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fhn7b" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.082775 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj"] Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.172497 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnxrg\" (UniqueName: \"kubernetes.io/projected/13b7c0b2-c215-4740-a6af-b67ce7ab0dd3-kube-api-access-xnxrg\") pod \"openstack-operator-controller-operator-64895cd698-r4rcj\" (UID: \"13b7c0b2-c215-4740-a6af-b67ce7ab0dd3\") " pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.273876 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnxrg\" (UniqueName: \"kubernetes.io/projected/13b7c0b2-c215-4740-a6af-b67ce7ab0dd3-kube-api-access-xnxrg\") pod \"openstack-operator-controller-operator-64895cd698-r4rcj\" (UID: \"13b7c0b2-c215-4740-a6af-b67ce7ab0dd3\") " pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.297892 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnxrg\" (UniqueName: \"kubernetes.io/projected/13b7c0b2-c215-4740-a6af-b67ce7ab0dd3-kube-api-access-xnxrg\") pod \"openstack-operator-controller-operator-64895cd698-r4rcj\" (UID: \"13b7c0b2-c215-4740-a6af-b67ce7ab0dd3\") " pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.389355 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.667357 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj"] Oct 13 13:21:38 crc kubenswrapper[4797]: I1013 13:21:38.873337 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" event={"ID":"13b7c0b2-c215-4740-a6af-b67ce7ab0dd3","Type":"ContainerStarted","Data":"31fd64f32ac8713e61f0eedcd51f628bb7bc93037dc610c120441ccc73768df0"} Oct 13 13:21:43 crc kubenswrapper[4797]: I1013 13:21:43.913087 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" event={"ID":"13b7c0b2-c215-4740-a6af-b67ce7ab0dd3","Type":"ContainerStarted","Data":"261a481219cd96d0dc5ee0d2c8f357e211936f3b5d1b82fbd39e2fa64987f332"} Oct 13 13:21:45 crc kubenswrapper[4797]: I1013 13:21:45.933534 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" event={"ID":"13b7c0b2-c215-4740-a6af-b67ce7ab0dd3","Type":"ContainerStarted","Data":"f5abf07611b8961984401dc9479906693f5035bba1548de0c928f7bf02ee60c1"} Oct 13 13:21:45 crc kubenswrapper[4797]: I1013 13:21:45.933915 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:21:45 crc kubenswrapper[4797]: I1013 13:21:45.991568 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" podStartSLOduration=1.378995776 podStartE2EDuration="7.991543077s" podCreationTimestamp="2025-10-13 13:21:38 +0000 UTC" firstStartedPulling="2025-10-13 
13:21:38.681626473 +0000 UTC m=+876.215176739" lastFinishedPulling="2025-10-13 13:21:45.294173784 +0000 UTC m=+882.827724040" observedRunningTime="2025-10-13 13:21:45.984505344 +0000 UTC m=+883.518055680" watchObservedRunningTime="2025-10-13 13:21:45.991543077 +0000 UTC m=+883.525093373" Oct 13 13:21:48 crc kubenswrapper[4797]: I1013 13:21:48.120390 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:21:48 crc kubenswrapper[4797]: I1013 13:21:48.120728 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:21:48 crc kubenswrapper[4797]: I1013 13:21:48.392543 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-64895cd698-r4rcj" Oct 13 13:22:18 crc kubenswrapper[4797]: I1013 13:22:18.119753 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:22:18 crc kubenswrapper[4797]: I1013 13:22:18.120376 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.311402 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.313298 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.314866 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.315200 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qcvxc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.315763 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.318339 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nnvzm"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.336686 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.338098 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.340755 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qqvj9"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.341127 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.342217 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.345416 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4986z"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.356936 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.361037 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.364655 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.378846 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.379922 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.382312 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jgf2b"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.383259 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.388821 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.399399 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.400380 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.407575 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.409606 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-pbnfg"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.428536 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.429515 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.434104 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4r9p5"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.434263 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.434932 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66748\" (UniqueName: \"kubernetes.io/projected/cc7558e9-8bd0-4dda-9792-49855202f2bf-kube-api-access-66748\") pod \"cinder-operator-controller-manager-7b7fb68549-jtc6c\" (UID: \"cc7558e9-8bd0-4dda-9792-49855202f2bf\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.434982 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4697q\" (UniqueName: \"kubernetes.io/projected/2ce34775-733e-42d7-a688-4c12edad7614-kube-api-access-4697q\") pod \"barbican-operator-controller-manager-658bdf4b74-4xbcc\" (UID: \"2ce34775-733e-42d7-a688-4c12edad7614\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.435026 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9g5\" (UniqueName: \"kubernetes.io/projected/424af6e9-8a27-446e-b11e-7a84032f476e-kube-api-access-kt9g5\") pod \"heat-operator-controller-manager-858f76bbdd-79946\" (UID: \"424af6e9-8a27-446e-b11e-7a84032f476e\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.435045 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7gfq\" (UniqueName: \"kubernetes.io/projected/f07f9d14-0fd4-4702-877a-8e0097a23791-kube-api-access-l7gfq\") pod \"designate-operator-controller-manager-85d5d9dd78-rdczf\" (UID: \"f07f9d14-0fd4-4702-877a-8e0097a23791\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.435073 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6bnk\" (UniqueName: \"kubernetes.io/projected/13487799-53f7-4c74-9f16-770bf4dbace5-kube-api-access-c6bnk\") pod \"glance-operator-controller-manager-84b9b84486-vfpgt\" (UID: \"13487799-53f7-4c74-9f16-770bf4dbace5\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.435091 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxvg\" (UniqueName: \"kubernetes.io/projected/161dd833-44d7-4dac-9ea7-d2c059e2f593-kube-api-access-ztxvg\") pod \"horizon-operator-controller-manager-7ffbcb7588-sdtkf\" (UID: \"161dd833-44d7-4dac-9ea7-d2c059e2f593\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.464761 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.465544 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.466855 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.472673 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.474840 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8zzxq"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.479826 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.481715 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.485300 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4sxmm"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.497318 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.498295 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.499796 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-79hhc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.501937 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.502986 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.511238 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xlnvg"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.514753 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.535906 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9g5\" (UniqueName: \"kubernetes.io/projected/424af6e9-8a27-446e-b11e-7a84032f476e-kube-api-access-kt9g5\") pod \"heat-operator-controller-manager-858f76bbdd-79946\" (UID: \"424af6e9-8a27-446e-b11e-7a84032f476e\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.535957 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gfq\" (UniqueName: \"kubernetes.io/projected/f07f9d14-0fd4-4702-877a-8e0097a23791-kube-api-access-l7gfq\") pod \"designate-operator-controller-manager-85d5d9dd78-rdczf\" (UID: \"f07f9d14-0fd4-4702-877a-8e0097a23791\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.535999 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndczr\" (UniqueName: \"kubernetes.io/projected/ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3-kube-api-access-ndczr\") pod \"mariadb-operator-controller-manager-f9fb45f8f-bld7w\" (UID: \"ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536030 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6bnk\" (UniqueName: \"kubernetes.io/projected/13487799-53f7-4c74-9f16-770bf4dbace5-kube-api-access-c6bnk\") pod \"glance-operator-controller-manager-84b9b84486-vfpgt\" (UID: \"13487799-53f7-4c74-9f16-770bf4dbace5\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536057 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxvg\" (UniqueName: \"kubernetes.io/projected/161dd833-44d7-4dac-9ea7-d2c059e2f593-kube-api-access-ztxvg\") pod \"horizon-operator-controller-manager-7ffbcb7588-sdtkf\" (UID: \"161dd833-44d7-4dac-9ea7-d2c059e2f593\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536084 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-468nl\" (UniqueName: \"kubernetes.io/projected/7493abc9-384b-43e1-aa00-1c6ae0ddf144-kube-api-access-468nl\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536109 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66748\" (UniqueName: \"kubernetes.io/projected/cc7558e9-8bd0-4dda-9792-49855202f2bf-kube-api-access-66748\") pod \"cinder-operator-controller-manager-7b7fb68549-jtc6c\" (UID: \"cc7558e9-8bd0-4dda-9792-49855202f2bf\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536130 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrlw\" (UniqueName: \"kubernetes.io/projected/f092abb5-bd31-41f3-bedb-1e9523f17044-kube-api-access-hsrlw\") pod \"ironic-operator-controller-manager-9c5c78d49-l8d5s\" (UID: \"f092abb5-bd31-41f3-bedb-1e9523f17044\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536151 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7493abc9-384b-43e1-aa00-1c6ae0ddf144-cert\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536176 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9sv2\" (UniqueName: \"kubernetes.io/projected/b9adb7ab-599b-4ac1-b7d4-d22efc7fda95-kube-api-access-j9sv2\") pod \"keystone-operator-controller-manager-55b6b7c7b8-znjhj\" (UID: \"b9adb7ab-599b-4ac1-b7d4-d22efc7fda95\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536214 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdh4c\" (UniqueName: \"kubernetes.io/projected/5132d95b-625f-4eb5-9a09-47e695441c86-kube-api-access-hdh4c\") pod \"manila-operator-controller-manager-5f67fbc655-m9cpw\" (UID: \"5132d95b-625f-4eb5-9a09-47e695441c86\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.536243 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4697q\" (UniqueName: \"kubernetes.io/projected/2ce34775-733e-42d7-a688-4c12edad7614-kube-api-access-4697q\") pod \"barbican-operator-controller-manager-658bdf4b74-4xbcc\" (UID: \"2ce34775-733e-42d7-a688-4c12edad7614\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.539729 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.560741 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.566057 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.566625 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxvg\" (UniqueName: \"kubernetes.io/projected/161dd833-44d7-4dac-9ea7-d2c059e2f593-kube-api-access-ztxvg\") pod \"horizon-operator-controller-manager-7ffbcb7588-sdtkf\" (UID: \"161dd833-44d7-4dac-9ea7-d2c059e2f593\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.566831 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7gfq\" (UniqueName: \"kubernetes.io/projected/f07f9d14-0fd4-4702-877a-8e0097a23791-kube-api-access-l7gfq\") pod \"designate-operator-controller-manager-85d5d9dd78-rdczf\" (UID: \"f07f9d14-0fd4-4702-877a-8e0097a23791\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.567046 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.572589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66748\" (UniqueName: \"kubernetes.io/projected/cc7558e9-8bd0-4dda-9792-49855202f2bf-kube-api-access-66748\") pod \"cinder-operator-controller-manager-7b7fb68549-jtc6c\" (UID: \"cc7558e9-8bd0-4dda-9792-49855202f2bf\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.572669 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.573402 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mls6p"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.573786 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6bnk\" (UniqueName: \"kubernetes.io/projected/13487799-53f7-4c74-9f16-770bf4dbace5-kube-api-access-c6bnk\") pod \"glance-operator-controller-manager-84b9b84486-vfpgt\" (UID: \"13487799-53f7-4c74-9f16-770bf4dbace5\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.583978 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.585454 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.586230 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.588964 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qdsbf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.589291 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9g5\" (UniqueName: \"kubernetes.io/projected/424af6e9-8a27-446e-b11e-7a84032f476e-kube-api-access-kt9g5\") pod \"heat-operator-controller-manager-858f76bbdd-79946\" (UID: \"424af6e9-8a27-446e-b11e-7a84032f476e\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.590364 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4697q\" (UniqueName: \"kubernetes.io/projected/2ce34775-733e-42d7-a688-4c12edad7614-kube-api-access-4697q\") pod \"barbican-operator-controller-manager-658bdf4b74-4xbcc\" (UID: \"2ce34775-733e-42d7-a688-4c12edad7614\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.597186 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.597551 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.618030 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-dx47m"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.635603 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638543 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdh4c\" (UniqueName: \"kubernetes.io/projected/5132d95b-625f-4eb5-9a09-47e695441c86-kube-api-access-hdh4c\") pod \"manila-operator-controller-manager-5f67fbc655-m9cpw\" (UID: \"5132d95b-625f-4eb5-9a09-47e695441c86\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdm6\" (UniqueName: \"kubernetes.io/projected/990f0215-8f03-4fb7-ae16-0d89130a5ba3-kube-api-access-pwdm6\") pod \"nova-operator-controller-manager-5df598886f-fcgbt\" (UID: \"990f0215-8f03-4fb7-ae16-0d89130a5ba3\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638642 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5sqm\" (UniqueName: \"kubernetes.io/projected/21276615-f6a5-4b36-b65a-4b45a1f4b7e4-kube-api-access-h5sqm\") pod \"octavia-operator-controller-manager-69fdcfc5f5-dh4qb\" (UID: \"21276615-f6a5-4b36-b65a-4b45a1f4b7e4\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638711 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndczr\" (UniqueName: \"kubernetes.io/projected/ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3-kube-api-access-ndczr\") pod \"mariadb-operator-controller-manager-f9fb45f8f-bld7w\" (UID: \"ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638758 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsm5w\" (UniqueName: \"kubernetes.io/projected/8c61f396-891f-4c58-ba21-e53d8e357358-kube-api-access-zsm5w\") pod \"neutron-operator-controller-manager-79d585cb66-pqvwj\" (UID: \"8c61f396-891f-4c58-ba21-e53d8e357358\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638792 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-468nl\" (UniqueName: \"kubernetes.io/projected/7493abc9-384b-43e1-aa00-1c6ae0ddf144-kube-api-access-468nl\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638890 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrlw\" (UniqueName: \"kubernetes.io/projected/f092abb5-bd31-41f3-bedb-1e9523f17044-kube-api-access-hsrlw\") pod \"ironic-operator-controller-manager-9c5c78d49-l8d5s\" (UID: \"f092abb5-bd31-41f3-bedb-1e9523f17044\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638924 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7493abc9-384b-43e1-aa00-1c6ae0ddf144-cert\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.638957 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9sv2\" (UniqueName: \"kubernetes.io/projected/b9adb7ab-599b-4ac1-b7d4-d22efc7fda95-kube-api-access-j9sv2\") pod \"keystone-operator-controller-manager-55b6b7c7b8-znjhj\" (UID: \"b9adb7ab-599b-4ac1-b7d4-d22efc7fda95\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"
Oct 13 13:22:23 crc kubenswrapper[4797]: E1013 13:22:23.639883 4797 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 13 13:22:23 crc kubenswrapper[4797]: E1013 13:22:23.639927 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7493abc9-384b-43e1-aa00-1c6ae0ddf144-cert podName:7493abc9-384b-43e1-aa00-1c6ae0ddf144 nodeName:}" failed. No retries permitted until 2025-10-13 13:22:24.139908477 +0000 UTC m=+921.673458723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7493abc9-384b-43e1-aa00-1c6ae0ddf144-cert") pod "infra-operator-controller-manager-656bcbd775-ntnr8" (UID: "7493abc9-384b-43e1-aa00-1c6ae0ddf144") : secret "infra-operator-webhook-server-cert" not found
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.647517 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.659841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrlw\" (UniqueName: \"kubernetes.io/projected/f092abb5-bd31-41f3-bedb-1e9523f17044-kube-api-access-hsrlw\") pod \"ironic-operator-controller-manager-9c5c78d49-l8d5s\" (UID: \"f092abb5-bd31-41f3-bedb-1e9523f17044\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.660184 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.661753 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndczr\" (UniqueName: \"kubernetes.io/projected/ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3-kube-api-access-ndczr\") pod \"mariadb-operator-controller-manager-f9fb45f8f-bld7w\" (UID: \"ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.694322 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.694636 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.694760 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdh4c\" (UniqueName: \"kubernetes.io/projected/5132d95b-625f-4eb5-9a09-47e695441c86-kube-api-access-hdh4c\") pod \"manila-operator-controller-manager-5f67fbc655-m9cpw\" (UID: \"5132d95b-625f-4eb5-9a09-47e695441c86\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.695281 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9sv2\" (UniqueName: \"kubernetes.io/projected/b9adb7ab-599b-4ac1-b7d4-d22efc7fda95-kube-api-access-j9sv2\") pod \"keystone-operator-controller-manager-55b6b7c7b8-znjhj\" (UID: \"b9adb7ab-599b-4ac1-b7d4-d22efc7fda95\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.695741 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.698286 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.698441 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qmrg6"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.698751 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.701388 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-468nl\" (UniqueName: \"kubernetes.io/projected/7493abc9-384b-43e1-aa00-1c6ae0ddf144-kube-api-access-468nl\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.707227 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.720819 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.725829 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.728345 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.731721 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lx2h9"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.737635 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.738723 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.739683 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsm5w\" (UniqueName: \"kubernetes.io/projected/8c61f396-891f-4c58-ba21-e53d8e357358-kube-api-access-zsm5w\") pod \"neutron-operator-controller-manager-79d585cb66-pqvwj\" (UID: \"8c61f396-891f-4c58-ba21-e53d8e357358\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.739744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08de49ed-c17f-42fc-8bb1-2cb6684984f1-cert\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: \"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.739779 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnkd\" (UniqueName: \"kubernetes.io/projected/08de49ed-c17f-42fc-8bb1-2cb6684984f1-kube-api-access-bbnkd\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: \"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.739818 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdm6\" (UniqueName: \"kubernetes.io/projected/990f0215-8f03-4fb7-ae16-0d89130a5ba3-kube-api-access-pwdm6\") pod \"nova-operator-controller-manager-5df598886f-fcgbt\" (UID: \"990f0215-8f03-4fb7-ae16-0d89130a5ba3\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.739840 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5sqm\" (UniqueName: \"kubernetes.io/projected/21276615-f6a5-4b36-b65a-4b45a1f4b7e4-kube-api-access-h5sqm\") pod \"octavia-operator-controller-manager-69fdcfc5f5-dh4qb\" (UID: \"21276615-f6a5-4b36-b65a-4b45a1f4b7e4\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.740618 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.743851 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vnlkt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.750719 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.757437 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.759828 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsm5w\" (UniqueName: \"kubernetes.io/projected/8c61f396-891f-4c58-ba21-e53d8e357358-kube-api-access-zsm5w\") pod \"neutron-operator-controller-manager-79d585cb66-pqvwj\" (UID: \"8c61f396-891f-4c58-ba21-e53d8e357358\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.761514 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5sqm\" (UniqueName: \"kubernetes.io/projected/21276615-f6a5-4b36-b65a-4b45a1f4b7e4-kube-api-access-h5sqm\") pod \"octavia-operator-controller-manager-69fdcfc5f5-dh4qb\" (UID: \"21276615-f6a5-4b36-b65a-4b45a1f4b7e4\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.763221 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdm6\" (UniqueName: \"kubernetes.io/projected/990f0215-8f03-4fb7-ae16-0d89130a5ba3-kube-api-access-pwdm6\") pod \"nova-operator-controller-manager-5df598886f-fcgbt\" (UID: \"990f0215-8f03-4fb7-ae16-0d89130a5ba3\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.771211 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.792279 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.801630 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.802607 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.805357 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-w5tj8"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.806526 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.808240 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.815734 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q"]
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.816724 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.822463 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-75dln"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.823302 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"
Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.828938 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.838797 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q"] Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.840650 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlr45\" (UniqueName: \"kubernetes.io/projected/f0e7ab2d-9124-44fe-aa40-abe8b405d449-kube-api-access-zlr45\") pod \"placement-operator-controller-manager-68b6c87b68-766z6\" (UID: \"f0e7ab2d-9124-44fe-aa40-abe8b405d449\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.840729 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bch67\" (UniqueName: \"kubernetes.io/projected/f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf-kube-api-access-bch67\") pod \"ovn-operator-controller-manager-79df5fb58c-mqjcg\" (UID: \"f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.840757 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08de49ed-c17f-42fc-8bb1-2cb6684984f1-cert\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: \"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.840799 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxwg\" (UniqueName: \"kubernetes.io/projected/549ef07f-ef05-4c9a-8700-a19008de4afe-kube-api-access-qnxwg\") 
pod \"telemetry-operator-controller-manager-67cfc6749b-v826q\" (UID: \"549ef07f-ef05-4c9a-8700-a19008de4afe\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.840861 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnkd\" (UniqueName: \"kubernetes.io/projected/08de49ed-c17f-42fc-8bb1-2cb6684984f1-kube-api-access-bbnkd\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: \"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.840923 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69z8\" (UniqueName: \"kubernetes.io/projected/ad662e1a-7f25-414a-9358-cb1994840925-kube-api-access-t69z8\") pod \"swift-operator-controller-manager-db6d7f97b-kgmrr\" (UID: \"ad662e1a-7f25-414a-9358-cb1994840925\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" Oct 13 13:22:23 crc kubenswrapper[4797]: E1013 13:22:23.841397 4797 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 13 13:22:23 crc kubenswrapper[4797]: E1013 13:22:23.841436 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08de49ed-c17f-42fc-8bb1-2cb6684984f1-cert podName:08de49ed-c17f-42fc-8bb1-2cb6684984f1 nodeName:}" failed. No retries permitted until 2025-10-13 13:22:24.341421472 +0000 UTC m=+921.874971728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08de49ed-c17f-42fc-8bb1-2cb6684984f1-cert") pod "openstack-baremetal-operator-controller-manager-55b7d448487kdnh" (UID: "08de49ed-c17f-42fc-8bb1-2cb6684984f1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.855864 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f"] Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.857005 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.862145 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.862623 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vvbpx" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.875927 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f"] Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.886281 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnkd\" (UniqueName: \"kubernetes.io/projected/08de49ed-c17f-42fc-8bb1-2cb6684984f1-kube-api-access-bbnkd\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: \"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.920075 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6"] Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.924238 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.927013 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tkn5c" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.942544 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxwg\" (UniqueName: \"kubernetes.io/projected/549ef07f-ef05-4c9a-8700-a19008de4afe-kube-api-access-qnxwg\") pod \"telemetry-operator-controller-manager-67cfc6749b-v826q\" (UID: \"549ef07f-ef05-4c9a-8700-a19008de4afe\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.942626 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t69z8\" (UniqueName: \"kubernetes.io/projected/ad662e1a-7f25-414a-9358-cb1994840925-kube-api-access-t69z8\") pod \"swift-operator-controller-manager-db6d7f97b-kgmrr\" (UID: \"ad662e1a-7f25-414a-9358-cb1994840925\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.942685 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7dht\" (UniqueName: \"kubernetes.io/projected/4cc72187-24a1-4ec5-907d-d5295814e428-kube-api-access-n7dht\") pod \"watcher-operator-controller-manager-7f554bff7b-qsnt6\" (UID: \"4cc72187-24a1-4ec5-907d-d5295814e428\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.942725 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zlr45\" (UniqueName: \"kubernetes.io/projected/f0e7ab2d-9124-44fe-aa40-abe8b405d449-kube-api-access-zlr45\") pod \"placement-operator-controller-manager-68b6c87b68-766z6\" (UID: \"f0e7ab2d-9124-44fe-aa40-abe8b405d449\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.942784 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bch67\" (UniqueName: \"kubernetes.io/projected/f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf-kube-api-access-bch67\") pod \"ovn-operator-controller-manager-79df5fb58c-mqjcg\" (UID: \"f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.942873 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh7tf\" (UniqueName: \"kubernetes.io/projected/535566f0-4f81-4362-a4a0-18c9b2dedd8d-kube-api-access-jh7tf\") pod \"test-operator-controller-manager-5458f77c4-vrt4f\" (UID: \"535566f0-4f81-4362-a4a0-18c9b2dedd8d\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.944151 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.951225 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6"] Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.964347 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69z8\" (UniqueName: \"kubernetes.io/projected/ad662e1a-7f25-414a-9358-cb1994840925-kube-api-access-t69z8\") pod \"swift-operator-controller-manager-db6d7f97b-kgmrr\" (UID: \"ad662e1a-7f25-414a-9358-cb1994840925\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.986230 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlr45\" (UniqueName: \"kubernetes.io/projected/f0e7ab2d-9124-44fe-aa40-abe8b405d449-kube-api-access-zlr45\") pod \"placement-operator-controller-manager-68b6c87b68-766z6\" (UID: \"f0e7ab2d-9124-44fe-aa40-abe8b405d449\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.987036 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnxwg\" (UniqueName: \"kubernetes.io/projected/549ef07f-ef05-4c9a-8700-a19008de4afe-kube-api-access-qnxwg\") pod \"telemetry-operator-controller-manager-67cfc6749b-v826q\" (UID: \"549ef07f-ef05-4c9a-8700-a19008de4afe\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" Oct 13 13:22:23 crc kubenswrapper[4797]: I1013 13:22:23.987547 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bch67\" (UniqueName: \"kubernetes.io/projected/f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf-kube-api-access-bch67\") pod \"ovn-operator-controller-manager-79df5fb58c-mqjcg\" (UID: 
\"f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.019402 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv"] Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.020840 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.023924 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7n9cn" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.024146 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.035837 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv"] Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.044310 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-cert\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.044351 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9jzw\" (UniqueName: \"kubernetes.io/projected/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-kube-api-access-n9jzw\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " 
pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.044388 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh7tf\" (UniqueName: \"kubernetes.io/projected/535566f0-4f81-4362-a4a0-18c9b2dedd8d-kube-api-access-jh7tf\") pod \"test-operator-controller-manager-5458f77c4-vrt4f\" (UID: \"535566f0-4f81-4362-a4a0-18c9b2dedd8d\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.044475 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dht\" (UniqueName: \"kubernetes.io/projected/4cc72187-24a1-4ec5-907d-d5295814e428-kube-api-access-n7dht\") pod \"watcher-operator-controller-manager-7f554bff7b-qsnt6\" (UID: \"4cc72187-24a1-4ec5-907d-d5295814e428\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.069705 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh7tf\" (UniqueName: \"kubernetes.io/projected/535566f0-4f81-4362-a4a0-18c9b2dedd8d-kube-api-access-jh7tf\") pod \"test-operator-controller-manager-5458f77c4-vrt4f\" (UID: \"535566f0-4f81-4362-a4a0-18c9b2dedd8d\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.070162 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.070464 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7dht\" (UniqueName: \"kubernetes.io/projected/4cc72187-24a1-4ec5-907d-d5295814e428-kube-api-access-n7dht\") pod \"watcher-operator-controller-manager-7f554bff7b-qsnt6\" (UID: \"4cc72187-24a1-4ec5-907d-d5295814e428\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.080494 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.096250 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.108157 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4"] Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.109204 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.128411 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4"] Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.128740 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9h85q" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.148970 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcn95\" (UniqueName: \"kubernetes.io/projected/c2797d9e-1ac0-4ac4-8d0e-8c9061623f50-kube-api-access-gcn95\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w9st4\" (UID: \"c2797d9e-1ac0-4ac4-8d0e-8c9061623f50\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.149116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7493abc9-384b-43e1-aa00-1c6ae0ddf144-cert\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.149138 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-cert\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.149166 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n9jzw\" (UniqueName: \"kubernetes.io/projected/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-kube-api-access-n9jzw\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: E1013 13:22:24.161585 4797 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 13 13:22:24 crc kubenswrapper[4797]: E1013 13:22:24.161641 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-cert podName:1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4 nodeName:}" failed. No retries permitted until 2025-10-13 13:22:24.661623029 +0000 UTC m=+922.195173285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-cert") pod "openstack-operator-controller-manager-7fb8c88b76-4rdzv" (UID: "1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4") : secret "webhook-server-cert" not found Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.166777 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7493abc9-384b-43e1-aa00-1c6ae0ddf144-cert\") pod \"infra-operator-controller-manager-656bcbd775-ntnr8\" (UID: \"7493abc9-384b-43e1-aa00-1c6ae0ddf144\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.194399 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.213668 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.215455 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9jzw\" (UniqueName: \"kubernetes.io/projected/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-kube-api-access-n9jzw\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.235309 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.250632 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcn95\" (UniqueName: \"kubernetes.io/projected/c2797d9e-1ac0-4ac4-8d0e-8c9061623f50-kube-api-access-gcn95\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w9st4\" (UID: \"c2797d9e-1ac0-4ac4-8d0e-8c9061623f50\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.269494 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcn95\" (UniqueName: \"kubernetes.io/projected/c2797d9e-1ac0-4ac4-8d0e-8c9061623f50-kube-api-access-gcn95\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w9st4\" (UID: \"c2797d9e-1ac0-4ac4-8d0e-8c9061623f50\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.351796 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08de49ed-c17f-42fc-8bb1-2cb6684984f1-cert\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: 
\"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.356543 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08de49ed-c17f-42fc-8bb1-2cb6684984f1-cert\") pod \"openstack-baremetal-operator-controller-manager-55b7d448487kdnh\" (UID: \"08de49ed-c17f-42fc-8bb1-2cb6684984f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.401087 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.449138 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.551546 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.756459 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-cert\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.760452 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4-cert\") pod \"openstack-operator-controller-manager-7fb8c88b76-4rdzv\" (UID: \"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4\") " pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.881118 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c"] Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.900905 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-79946"] Oct 13 13:22:24 crc kubenswrapper[4797]: I1013 13:22:24.906429 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.043128 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.252756 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" event={"ID":"2ce34775-733e-42d7-a688-4c12edad7614","Type":"ContainerStarted","Data":"ae4b2f2842b941ea5a8f99208a698fe1ae25df64d4c2e95b23923ab1ba477d5c"} Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.253051 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.253067 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946" event={"ID":"424af6e9-8a27-446e-b11e-7a84032f476e","Type":"ContainerStarted","Data":"fab325a26c55e05c227d002132b3e7ab21476e9d8ca6b1174666db5d13402e98"} Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.253077 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c" event={"ID":"cc7558e9-8bd0-4dda-9792-49855202f2bf","Type":"ContainerStarted","Data":"8717b3776c70dbc89d5c17ea9ba9ab2ec355e1e6e741ce6da91d22a4024970a9"} Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.256138 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.268570 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.283470 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.309010 
4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.321127 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf"] Oct 13 13:22:25 crc kubenswrapper[4797]: W1013 13:22:25.323262 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf092abb5_bd31_41f3_bedb_1e9523f17044.slice/crio-59eecc100925a3ffb66e6996eabcf85bf3009f0b7f4a203d8b811d6d081b2e2d WatchSource:0}: Error finding container 59eecc100925a3ffb66e6996eabcf85bf3009f0b7f4a203d8b811d6d081b2e2d: Status 404 returned error can't find the container with id 59eecc100925a3ffb66e6996eabcf85bf3009f0b7f4a203d8b811d6d081b2e2d Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.332747 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:063a7e65b4ba98f0506f269ff7525b446eae06a5ed4a61c18ffa33a886500867,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ztxvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-7ffbcb7588-sdtkf_openstack-operators(161dd833-44d7-4dac-9ea7-d2c059e2f593): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.339540 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s"] Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.342521 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j9sv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-55b6b7c7b8-znjhj_openstack-operators(b9adb7ab-599b-4ac1-b7d4-d22efc7fda95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.351955 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.360282 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.366408 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.373535 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj"] Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.374079 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n7dht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f554bff7b-qsnt6_openstack-operators(4cc72187-24a1-4ec5-907d-d5295814e428): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: W1013 13:22:25.377694 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e7ab2d_9124_44fe_aa40_abe8b405d449.slice/crio-4eec81c88a35eb52777f2d19c50ef06875dda55b05692bca6c8bf6431e637263 WatchSource:0}: Error finding container 4eec81c88a35eb52777f2d19c50ef06875dda55b05692bca6c8bf6431e637263: Status 404 returned error can't find the container with id 4eec81c88a35eb52777f2d19c50ef06875dda55b05692bca6c8bf6431e637263 Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.377867 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6"] Oct 13 13:22:25 crc kubenswrapper[4797]: W1013 13:22:25.380995 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad662e1a_7f25_414a_9358_cb1994840925.slice/crio-e98d32589d8c3e24a35e89159609aef7f0844db222a4f4100d805fe243cb4f53 WatchSource:0}: Error finding container e98d32589d8c3e24a35e89159609aef7f0844db222a4f4100d805fe243cb4f53: Status 404 returned error can't find the container with id e98d32589d8c3e24a35e89159609aef7f0844db222a4f4100d805fe243cb4f53 Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.381163 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zlr45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-68b6c87b68-766z6_openstack-operators(f0e7ab2d-9124-44fe-aa40-abe8b405d449): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.381973 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.390437 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.393894 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.398196 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q"] Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.399643 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t69z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-db6d7f97b-kgmrr_openstack-operators(ad662e1a-7f25-414a-9358-cb1994840925): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: W1013 13:22:25.404508 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549ef07f_ef05_4c9a_8700_a19008de4afe.slice/crio-a21e939aa44eb0d104e105d759e1c4cf70a51b5f2b6f9ff564141df3c6fb2b4f WatchSource:0}: Error finding container a21e939aa44eb0d104e105d759e1c4cf70a51b5f2b6f9ff564141df3c6fb2b4f: Status 404 returned error can't find the container with id a21e939aa44eb0d104e105d759e1c4cf70a51b5f2b6f9ff564141df3c6fb2b4f Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.407004 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qnxwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-67cfc6749b-v826q_openstack-operators(549ef07f-ef05-4c9a-8700-a19008de4afe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: W1013 13:22:25.408645 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535566f0_4f81_4362_a4a0_18c9b2dedd8d.slice/crio-cc7e2fce046ad1e187ced110a2840b582765922471e92efb45b8ae2b444a09de WatchSource:0}: Error finding container cc7e2fce046ad1e187ced110a2840b582765922471e92efb45b8ae2b444a09de: Status 404 returned error can't find the container with id cc7e2fce046ad1e187ced110a2840b582765922471e92efb45b8ae2b444a09de Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.415136 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jh7tf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5458f77c4-vrt4f_openstack-operators(535566f0-4f81-4362-a4a0-18c9b2dedd8d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.573916 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" podUID="b9adb7ab-599b-4ac1-b7d4-d22efc7fda95" Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.574480 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" podUID="161dd833-44d7-4dac-9ea7-d2c059e2f593" Oct 13 13:22:25 crc kubenswrapper[4797]: 
E1013 13:22:25.587003 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" podUID="4cc72187-24a1-4ec5-907d-d5295814e428" Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.636729 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.662875 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.667592 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8"] Oct 13 13:22:25 crc kubenswrapper[4797]: I1013 13:22:25.675628 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4"] Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.734068 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-468nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-656bcbd775-ntnr8_openstack-operators(7493abc9-384b-43e1-aa00-1c6ae0ddf144): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.885104 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" podUID="535566f0-4f81-4362-a4a0-18c9b2dedd8d" Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.909115 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" podUID="ad662e1a-7f25-414a-9358-cb1994840925" Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.921489 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" podUID="f0e7ab2d-9124-44fe-aa40-abe8b405d449" Oct 13 13:22:25 crc kubenswrapper[4797]: E1013 13:22:25.938928 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" podUID="549ef07f-ef05-4c9a-8700-a19008de4afe" Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.049040 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" podUID="7493abc9-384b-43e1-aa00-1c6ae0ddf144" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.274660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" event={"ID":"5132d95b-625f-4eb5-9a09-47e695441c86","Type":"ContainerStarted","Data":"f5c22f37e3b75fdb78d2725a5bfac10f0d17a573fbf451f01e33ddf6c09fd7e2"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.295268 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" event={"ID":"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4","Type":"ContainerStarted","Data":"3b90ce25bb7c4de0efeb19a6b7d2c52362bad54877d2fd906ae4e76d230c7d9d"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.295295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" event={"ID":"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4","Type":"ContainerStarted","Data":"608054ed4e8d441a1c20356bb723528787f051ef0fc6bb88b808fa952c269480"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.295303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" event={"ID":"1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4","Type":"ContainerStarted","Data":"f2916709905bf02c6d0cec6bb329b17a49fe3ea758aab29ac727dc5cda073ba8"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.296155 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.320483 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" event={"ID":"b9adb7ab-599b-4ac1-b7d4-d22efc7fda95","Type":"ContainerStarted","Data":"6380cd648b5cd6e5e654b64aba019add5d0c12b7e6e26ef08a20d2d00b65b14c"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.320509 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" event={"ID":"b9adb7ab-599b-4ac1-b7d4-d22efc7fda95","Type":"ContainerStarted","Data":"cf2854a71233f2237cc41f2fa08d2fb0a5a6361746e3c1331d97b0385ef17778"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.341475 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" podStartSLOduration=3.341460879 podStartE2EDuration="3.341460879s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:22:26.337575154 +0000 UTC m=+923.871125420" watchObservedRunningTime="2025-10-13 13:22:26.341460879 +0000 UTC m=+923.875011135" Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.347044 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" podUID="b9adb7ab-599b-4ac1-b7d4-d22efc7fda95" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.350496 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" event={"ID":"7493abc9-384b-43e1-aa00-1c6ae0ddf144","Type":"ContainerStarted","Data":"c36e8f9481c965fb7c1fd06d61b484d35fea068fad898e7a0c4b358c91cdeed7"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.350525 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" event={"ID":"7493abc9-384b-43e1-aa00-1c6ae0ddf144","Type":"ContainerStarted","Data":"74d007db37006a2d11476d48d3bb4f7ef4e2c341f252401d869d50e86d29f7c2"} Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.354918 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" 
pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" podUID="7493abc9-384b-43e1-aa00-1c6ae0ddf144" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.359141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" event={"ID":"535566f0-4f81-4362-a4a0-18c9b2dedd8d","Type":"ContainerStarted","Data":"10cbc56491c84e02756e3da74445fb3aef91a752e3b8ce8d3742608ac9e91a89"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.359182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" event={"ID":"535566f0-4f81-4362-a4a0-18c9b2dedd8d","Type":"ContainerStarted","Data":"cc7e2fce046ad1e187ced110a2840b582765922471e92efb45b8ae2b444a09de"} Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.361406 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" podUID="535566f0-4f81-4362-a4a0-18c9b2dedd8d" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.368201 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" event={"ID":"549ef07f-ef05-4c9a-8700-a19008de4afe","Type":"ContainerStarted","Data":"0c7c6eca5909c1a5a8fe7f2b95465b784341be21439a7a7926d8f8b66a34d5d6"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.368240 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" event={"ID":"549ef07f-ef05-4c9a-8700-a19008de4afe","Type":"ContainerStarted","Data":"a21e939aa44eb0d104e105d759e1c4cf70a51b5f2b6f9ff564141df3c6fb2b4f"} Oct 13 13:22:26 
crc kubenswrapper[4797]: E1013 13:22:26.374990 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" podUID="549ef07f-ef05-4c9a-8700-a19008de4afe" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.379055 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" event={"ID":"c2797d9e-1ac0-4ac4-8d0e-8c9061623f50","Type":"ContainerStarted","Data":"300cc06ac408df670ee5d8c5c91e5503039fac397cf6b99e0c5eb0cf5ae2bd6f"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.380382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj" event={"ID":"8c61f396-891f-4c58-ba21-e53d8e357358","Type":"ContainerStarted","Data":"5b6b19d146b44391883ea72e030128efcfa57b951fe45b3aa4c84f9db0a4505e"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.409300 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" event={"ID":"4cc72187-24a1-4ec5-907d-d5295814e428","Type":"ContainerStarted","Data":"1fec723676d9cad9ce906d89e9ffad337b3b0a1e2a1c7342b1de3bb871222bb6"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.409343 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" event={"ID":"4cc72187-24a1-4ec5-907d-d5295814e428","Type":"ContainerStarted","Data":"ab6084c1a74e4b846b99a9378720cbe289ae9c28f4e808ae65ee66155ceb0782"} Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.414024 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" podUID="4cc72187-24a1-4ec5-907d-d5295814e428" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.428019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" event={"ID":"ad662e1a-7f25-414a-9358-cb1994840925","Type":"ContainerStarted","Data":"60f90c91b67efb5a995511e0a5d61efb825a3a10d26ccb2b61626e704f639b45"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.428063 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" event={"ID":"ad662e1a-7f25-414a-9358-cb1994840925","Type":"ContainerStarted","Data":"e98d32589d8c3e24a35e89159609aef7f0844db222a4f4100d805fe243cb4f53"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.452978 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" event={"ID":"13487799-53f7-4c74-9f16-770bf4dbace5","Type":"ContainerStarted","Data":"33480879b5427a1d77852085a53738e08d91b03e6c6c09e9e817cd161feaabb8"} Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.457418 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" podUID="ad662e1a-7f25-414a-9358-cb1994840925" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.458472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" event={"ID":"21276615-f6a5-4b36-b65a-4b45a1f4b7e4","Type":"ContainerStarted","Data":"8479e5a908d754ca7f3398bc432b8039105a3665cacdc46d9fcefd2f850ec028"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.461879 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" event={"ID":"f092abb5-bd31-41f3-bedb-1e9523f17044","Type":"ContainerStarted","Data":"59eecc100925a3ffb66e6996eabcf85bf3009f0b7f4a203d8b811d6d081b2e2d"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.464612 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" event={"ID":"990f0215-8f03-4fb7-ae16-0d89130a5ba3","Type":"ContainerStarted","Data":"f86ee245b731435ee97e19b0f1eb0317b2eb2e0119f34783931ae1bb28941714"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.466926 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" event={"ID":"08de49ed-c17f-42fc-8bb1-2cb6684984f1","Type":"ContainerStarted","Data":"31e04c1abdda8c891f5a884de78be7ce0c2aea39d0b13d1ac32f86eba4edeef5"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.467695 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" event={"ID":"ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3","Type":"ContainerStarted","Data":"cae70f70e6a15fa081c1b32ab19208fd60f3f86aee03c3da032fc46751889b95"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.469389 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf" event={"ID":"f07f9d14-0fd4-4702-877a-8e0097a23791","Type":"ContainerStarted","Data":"0b0a1438cd0170065948d6b1e31f7858bf796854689a3fde528d198406ea9c0e"} Oct 13 13:22:26 
crc kubenswrapper[4797]: I1013 13:22:26.473936 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" event={"ID":"f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf","Type":"ContainerStarted","Data":"806c9a0e5de44d12a9ad9ee4a774421a96db73052ea814037bd6e1216da8ddde"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.508944 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" event={"ID":"f0e7ab2d-9124-44fe-aa40-abe8b405d449","Type":"ContainerStarted","Data":"263493c4a11355156adf974d782d31e4cc1566bf1ea046c356e46f7e165f622a"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.508981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" event={"ID":"f0e7ab2d-9124-44fe-aa40-abe8b405d449","Type":"ContainerStarted","Data":"4eec81c88a35eb52777f2d19c50ef06875dda55b05692bca6c8bf6431e637263"} Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.510490 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" podUID="f0e7ab2d-9124-44fe-aa40-abe8b405d449" Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.532499 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" event={"ID":"161dd833-44d7-4dac-9ea7-d2c059e2f593","Type":"ContainerStarted","Data":"d12e5fc6fde1ed77bf4d16b11ebea664ec90805ca29bba79b882d946d1b800b4"} Oct 13 13:22:26 crc kubenswrapper[4797]: I1013 13:22:26.532548 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" event={"ID":"161dd833-44d7-4dac-9ea7-d2c059e2f593","Type":"ContainerStarted","Data":"e223bc6975b8317f51afb7b385837218a81967305657d6905ec242b94f95f2f1"} Oct 13 13:22:26 crc kubenswrapper[4797]: E1013 13:22:26.534459 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:063a7e65b4ba98f0506f269ff7525b446eae06a5ed4a61c18ffa33a886500867\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" podUID="161dd833-44d7-4dac-9ea7-d2c059e2f593" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.548790 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" podUID="549ef07f-ef05-4c9a-8700-a19008de4afe" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.548968 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:063a7e65b4ba98f0506f269ff7525b446eae06a5ed4a61c18ffa33a886500867\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" podUID="161dd833-44d7-4dac-9ea7-d2c059e2f593" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.549012 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" podUID="7493abc9-384b-43e1-aa00-1c6ae0ddf144" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.549047 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" podUID="b9adb7ab-599b-4ac1-b7d4-d22efc7fda95" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.549073 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" podUID="4cc72187-24a1-4ec5-907d-d5295814e428" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.549126 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" podUID="ad662e1a-7f25-414a-9358-cb1994840925" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.549246 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" podUID="f0e7ab2d-9124-44fe-aa40-abe8b405d449" Oct 13 13:22:27 crc kubenswrapper[4797]: E1013 13:22:27.550561 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" podUID="535566f0-4f81-4362-a4a0-18c9b2dedd8d" Oct 13 13:22:35 crc kubenswrapper[4797]: I1013 13:22:35.048919 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7fb8c88b76-4rdzv" Oct 13 13:22:39 crc kubenswrapper[4797]: E1013 13:22:39.661675 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960" Oct 13 13:22:39 crc kubenswrapper[4797]: E1013 13:22:39.662446 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hsrlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-9c5c78d49-l8d5s_openstack-operators(f092abb5-bd31-41f3-bedb-1e9523f17044): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:22:40 crc kubenswrapper[4797]: E1013 13:22:40.279003 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7" Oct 13 13:22:40 crc kubenswrapper[4797]: E1013 13:22:40.279194 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5sqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69fdcfc5f5-dh4qb_openstack-operators(21276615-f6a5-4b36-b65a-4b45a1f4b7e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:22:40 crc kubenswrapper[4797]: E1013 13:22:40.771340 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:783f711b4cb179819cfcb81167c3591c70671440f4551bbe48b7a8730567f577" Oct 13 13:22:40 crc kubenswrapper[4797]: E1013 13:22:40.771555 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:783f711b4cb179819cfcb81167c3591c70671440f4551bbe48b7a8730567f577,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4697q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-operator-controller-manager-658bdf4b74-4xbcc_openstack-operators(2ce34775-733e-42d7-a688-4c12edad7614): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:22:44 crc kubenswrapper[4797]: E1013 13:22:44.986373 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd" Oct 13 13:22:44 crc kubenswrapper[4797]: E1013 13:22:44.987525 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pwdm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5df598886f-fcgbt_openstack-operators(990f0215-8f03-4fb7-ae16-0d89130a5ba3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:22:45 crc kubenswrapper[4797]: E1013 13:22:45.345590 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0" Oct 13 13:22:45 crc kubenswrapper[4797]: E1013 13:22:45.345949 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c6bnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
glance-operator-controller-manager-84b9b84486-vfpgt_openstack-operators(13487799-53f7-4c74-9f16-770bf4dbace5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:22:46 crc kubenswrapper[4797]: E1013 13:22:46.009621 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14" Oct 13 13:22:46 crc kubenswrapper[4797]: E1013 13:22:46.009781 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bch67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-79df5fb58c-mqjcg_openstack-operators(f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:22:47 crc kubenswrapper[4797]: E1013 13:22:47.402300 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" podUID="f092abb5-bd31-41f3-bedb-1e9523f17044" Oct 13 13:22:47 crc kubenswrapper[4797]: I1013 13:22:47.690640 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" 
event={"ID":"f092abb5-bd31-41f3-bedb-1e9523f17044","Type":"ContainerStarted","Data":"cdb7c4b816c08f24aea70e328f43a7f22e40dc6d608f0e9a84be7aa66cadd9e3"} Oct 13 13:22:47 crc kubenswrapper[4797]: E1013 13:22:47.692471 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" podUID="f092abb5-bd31-41f3-bedb-1e9523f17044" Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.120400 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.120472 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.120524 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.121269 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d13bde04be8878f52602789b8a495d96204227aa290488bc4d6eac0aef285521"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.121328 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://d13bde04be8878f52602789b8a495d96204227aa290488bc4d6eac0aef285521" gracePeriod=600 Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.699596 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="d13bde04be8878f52602789b8a495d96204227aa290488bc4d6eac0aef285521" exitCode=0 Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.699676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"d13bde04be8878f52602789b8a495d96204227aa290488bc4d6eac0aef285521"} Oct 13 13:22:48 crc kubenswrapper[4797]: I1013 13:22:48.699739 4797 scope.go:117] "RemoveContainer" containerID="61854bbd861c1fc9b67c996c47d52d46e92470dc4bfb3423c7c24026ce57b8ba" Oct 13 13:22:48 crc kubenswrapper[4797]: E1013 13:22:48.702230 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" podUID="f092abb5-bd31-41f3-bedb-1e9523f17044" Oct 13 13:22:49 crc kubenswrapper[4797]: E1013 13:22:49.781966 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" 
podUID="21276615-f6a5-4b36-b65a-4b45a1f4b7e4" Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.009701 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" podUID="f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf" Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.098870 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" podUID="13487799-53f7-4c74-9f16-770bf4dbace5" Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.165203 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" podUID="990f0215-8f03-4fb7-ae16-0d89130a5ba3" Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.418625 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" podUID="2ce34775-733e-42d7-a688-4c12edad7614" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.719845 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj" event={"ID":"8c61f396-891f-4c58-ba21-e53d8e357358","Type":"ContainerStarted","Data":"6708bcca23ba27af9786dee15fe1e2c18d689b9fe8d762d21376ac098cc04165"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.724904 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" event={"ID":"990f0215-8f03-4fb7-ae16-0d89130a5ba3","Type":"ContainerStarted","Data":"b7c4ae054fcf0379c428315f7a8636352f72c0007da12ad2b56c9a7c67bd96ab"} Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.730300 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" podUID="990f0215-8f03-4fb7-ae16-0d89130a5ba3" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.737376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" event={"ID":"08de49ed-c17f-42fc-8bb1-2cb6684984f1","Type":"ContainerStarted","Data":"df1984dcf560a5a45705b46ef11706504f252ae7b15cdbeef207cd751ccb4bc5"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.758932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" event={"ID":"161dd833-44d7-4dac-9ea7-d2c059e2f593","Type":"ContainerStarted","Data":"24f5ad38293d46a0e62dbe5bcec418764a2b558d37c1ffa544150c41e038b1a6"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.759819 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.763182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" 
event={"ID":"2ce34775-733e-42d7-a688-4c12edad7614","Type":"ContainerStarted","Data":"f4d3a80e296aeaadd43649724a4214798b7dc1cac6f09c5922e8a802e497f622"} Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.764547 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:783f711b4cb179819cfcb81167c3591c70671440f4551bbe48b7a8730567f577\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" podUID="2ce34775-733e-42d7-a688-4c12edad7614" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.765541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" event={"ID":"5132d95b-625f-4eb5-9a09-47e695441c86","Type":"ContainerStarted","Data":"a42784ee803d81b431b47941c08bafc12b524ab237d475ce392a4aa8be10af15"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.768967 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" event={"ID":"7493abc9-384b-43e1-aa00-1c6ae0ddf144","Type":"ContainerStarted","Data":"813adcab47451b4c038f272bbaebe2c5228641d238129f1248f3ce4c2150fe27"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.769396 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.780085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946" event={"ID":"424af6e9-8a27-446e-b11e-7a84032f476e","Type":"ContainerStarted","Data":"855322f14959c601039ad56afe561f2325c2888bfacf1321a1e573ab062e8ed1"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.781420 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf" event={"ID":"f07f9d14-0fd4-4702-877a-8e0097a23791","Type":"ContainerStarted","Data":"a6c4bf9f42cd50464783065ecccf864b20fb2c1fb3728dbeb3b17fccedcfb4ed"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.782787 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"d5941c177a81de15babd5a721122470aaaaccd1b9033980aac5b1cd72b64076c"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.787283 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" event={"ID":"f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf","Type":"ContainerStarted","Data":"68303506db104a00bbc16ec0c303b025d09454ffe33949844f612c91fa97eaf1"} Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.792927 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" podUID="f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.793431 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c" event={"ID":"cc7558e9-8bd0-4dda-9792-49855202f2bf","Type":"ContainerStarted","Data":"bd67aa9c45bd27b88eda73dd3158325ee8eb78a0374f2bbf2f1a295ee93d95bf"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.801681 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" podStartSLOduration=3.284943702 
podStartE2EDuration="27.801660193s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.332614313 +0000 UTC m=+922.866164569" lastFinishedPulling="2025-10-13 13:22:49.849330814 +0000 UTC m=+947.382881060" observedRunningTime="2025-10-13 13:22:50.78361445 +0000 UTC m=+948.317164716" watchObservedRunningTime="2025-10-13 13:22:50.801660193 +0000 UTC m=+948.335210449" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.810076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" event={"ID":"21276615-f6a5-4b36-b65a-4b45a1f4b7e4","Type":"ContainerStarted","Data":"3d231dc14fb429e5e79d4f403bf57d3a628946283444988b0803be57d8cf9b68"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.823785 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" event={"ID":"ad662e1a-7f25-414a-9358-cb1994840925","Type":"ContainerStarted","Data":"5294826cb8979e43015ad90d4ce4cd4b2362f5420fe9786f75e6053b1f6dd3af"} Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.824392 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" Oct 13 13:22:50 crc kubenswrapper[4797]: I1013 13:22:50.832319 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" podStartSLOduration=3.717862245 podStartE2EDuration="27.832296034s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.733925301 +0000 UTC m=+923.267475557" lastFinishedPulling="2025-10-13 13:22:49.84835909 +0000 UTC m=+947.381909346" observedRunningTime="2025-10-13 13:22:50.819248094 +0000 UTC m=+948.352798340" watchObservedRunningTime="2025-10-13 13:22:50.832296034 +0000 UTC m=+948.365846290" Oct 13 13:22:50 crc 
kubenswrapper[4797]: I1013 13:22:50.835074 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" event={"ID":"13487799-53f7-4c74-9f16-770bf4dbace5","Type":"ContainerStarted","Data":"3f8108b02faa235524fb0f876ee0079b9570a50c6d8a93c7f46e2d601dace826"} Oct 13 13:22:50 crc kubenswrapper[4797]: E1013 13:22:50.840060 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" podUID="13487799-53f7-4c74-9f16-770bf4dbace5" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.059560 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" podStartSLOduration=3.620455165 podStartE2EDuration="28.05953607s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.399507105 +0000 UTC m=+922.933057361" lastFinishedPulling="2025-10-13 13:22:49.838588 +0000 UTC m=+947.372138266" observedRunningTime="2025-10-13 13:22:51.045333632 +0000 UTC m=+948.578883898" watchObservedRunningTime="2025-10-13 13:22:51.05953607 +0000 UTC m=+948.593086326" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.847912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" event={"ID":"21276615-f6a5-4b36-b65a-4b45a1f4b7e4","Type":"ContainerStarted","Data":"2c5cbf913bf81d0ceb655400bdaf284935a36ab38478d7ed74247bd28c976981"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.848924 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.851218 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c" event={"ID":"cc7558e9-8bd0-4dda-9792-49855202f2bf","Type":"ContainerStarted","Data":"defa3b6eabe99abcc4133e11fbc499a0857891dde38c8f71a3291f10abedd023"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.851899 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.854557 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" event={"ID":"4cc72187-24a1-4ec5-907d-d5295814e428","Type":"ContainerStarted","Data":"62f1bae71f824767db3d34049c18571ae74040c81d7d90f93efdd8692e23c6c4"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.855211 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.857697 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" event={"ID":"08de49ed-c17f-42fc-8bb1-2cb6684984f1","Type":"ContainerStarted","Data":"cb1a1651862723ff112c018d0aeebb432fefb7db6a52878cf2283977eca2ef64"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.858126 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.862867 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" 
event={"ID":"ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3","Type":"ContainerStarted","Data":"3dfdb35bbb02a21c1089968f4469e2da4967f2b2f48b376c1b8dadcae32c8ce5"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.862918 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" event={"ID":"ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3","Type":"ContainerStarted","Data":"015e58fdf9bd2e61c2631035916e080be2cf55611909bea4b9882e1bdc966050"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.863161 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.865676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" event={"ID":"535566f0-4f81-4362-a4a0-18c9b2dedd8d","Type":"ContainerStarted","Data":"364c6a1aa18aed4126976b8fb9dba52bf3b70d85be526a8b814048db44b37217"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.865887 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.867526 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf" event={"ID":"f07f9d14-0fd4-4702-877a-8e0097a23791","Type":"ContainerStarted","Data":"a6fb2197edf6bf8dd31b5e2742215888979e4720f12176fe3e3d9976cbd66477"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.867684 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.872318 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" event={"ID":"549ef07f-ef05-4c9a-8700-a19008de4afe","Type":"ContainerStarted","Data":"20660a3c9c60000b163597954656f4237bee2112ff338a04a6e7e305624107f5"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.872502 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.872851 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" podStartSLOduration=2.84799735 podStartE2EDuration="28.872839288s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.328872571 +0000 UTC m=+922.862422827" lastFinishedPulling="2025-10-13 13:22:51.353714509 +0000 UTC m=+948.887264765" observedRunningTime="2025-10-13 13:22:51.871356541 +0000 UTC m=+949.404906807" watchObservedRunningTime="2025-10-13 13:22:51.872839288 +0000 UTC m=+949.406389544" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.874467 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj" event={"ID":"8c61f396-891f-4c58-ba21-e53d8e357358","Type":"ContainerStarted","Data":"a4edcd254b28ce028e5275c8aa12d92f980379351d1468db6c45e382e124da35"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.874923 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.876789 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" 
event={"ID":"b9adb7ab-599b-4ac1-b7d4-d22efc7fda95","Type":"ContainerStarted","Data":"5c6c7cdcdd319770ab15068078c92dddac16a1140991bea003f2e28d8832a5a1"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.877254 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.879311 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" event={"ID":"c2797d9e-1ac0-4ac4-8d0e-8c9061623f50","Type":"ContainerStarted","Data":"21005a3d99d8119c5d5a375af1d3d4e8dbfc52612ec55c43667ca88ad5ee55e1"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.885275 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" event={"ID":"f0e7ab2d-9124-44fe-aa40-abe8b405d449","Type":"ContainerStarted","Data":"d3a7c3e7c9925ec8f48a6cc60f41d5e35483d2516a39def353dc5bd5e552626d"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.885517 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.887752 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946" event={"ID":"424af6e9-8a27-446e-b11e-7a84032f476e","Type":"ContainerStarted","Data":"48aaa2293551e9f8c0548340997972cbe8a7baa851482974fe10d354cb8b4336"} Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.888212 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.890033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" event={"ID":"5132d95b-625f-4eb5-9a09-47e695441c86","Type":"ContainerStarted","Data":"b5bb33244018fd09185276964ceaed6c1d8603cd9ed13ffeef3136ce90ceac58"} Oct 13 13:22:51 crc kubenswrapper[4797]: E1013 13:22:51.902436 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" podUID="f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf" Oct 13 13:22:51 crc kubenswrapper[4797]: E1013 13:22:51.909992 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" podUID="13487799-53f7-4c74-9f16-770bf4dbace5" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.928220 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" podStartSLOduration=4.465719616 podStartE2EDuration="28.928200206s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.373939677 +0000 UTC m=+922.907489933" lastFinishedPulling="2025-10-13 13:22:49.836420257 +0000 UTC m=+947.369970523" observedRunningTime="2025-10-13 13:22:51.904254439 +0000 UTC m=+949.437804705" watchObservedRunningTime="2025-10-13 13:22:51.928200206 +0000 UTC m=+949.461750462" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.938625 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" podStartSLOduration=6.438562897 podStartE2EDuration="28.938606932s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.332001048 +0000 UTC m=+922.865551304" lastFinishedPulling="2025-10-13 13:22:47.832045073 +0000 UTC m=+945.365595339" observedRunningTime="2025-10-13 13:22:51.923212794 +0000 UTC m=+949.456763050" watchObservedRunningTime="2025-10-13 13:22:51.938606932 +0000 UTC m=+949.472157188" Oct 13 13:22:51 crc kubenswrapper[4797]: E1013 13:22:51.939001 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" podUID="990f0215-8f03-4fb7-ae16-0d89130a5ba3" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.959010 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c" podStartSLOduration=4.199726979 podStartE2EDuration="28.958992852s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:24.897147787 +0000 UTC m=+922.430698043" lastFinishedPulling="2025-10-13 13:22:49.65641366 +0000 UTC m=+947.189963916" observedRunningTime="2025-10-13 13:22:51.955443265 +0000 UTC m=+949.488993541" watchObservedRunningTime="2025-10-13 13:22:51.958992852 +0000 UTC m=+949.492543108" Oct 13 13:22:51 crc kubenswrapper[4797]: I1013 13:22:51.980605 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" podStartSLOduration=4.993605241 podStartE2EDuration="28.980584432s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" 
firstStartedPulling="2025-10-13 13:22:25.669409268 +0000 UTC m=+923.202959524" lastFinishedPulling="2025-10-13 13:22:49.656388459 +0000 UTC m=+947.189938715" observedRunningTime="2025-10-13 13:22:51.974522543 +0000 UTC m=+949.508072789" watchObservedRunningTime="2025-10-13 13:22:51.980584432 +0000 UTC m=+949.514134688" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.004731 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" podStartSLOduration=4.582705157 podStartE2EDuration="29.004713494s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.415021085 +0000 UTC m=+922.948571341" lastFinishedPulling="2025-10-13 13:22:49.837029422 +0000 UTC m=+947.370579678" observedRunningTime="2025-10-13 13:22:52.000708166 +0000 UTC m=+949.534258442" watchObservedRunningTime="2025-10-13 13:22:52.004713494 +0000 UTC m=+949.538263750" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.024273 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf" podStartSLOduration=7.862768455 podStartE2EDuration="29.024257333s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.31169226 +0000 UTC m=+922.845242516" lastFinishedPulling="2025-10-13 13:22:46.473181138 +0000 UTC m=+944.006731394" observedRunningTime="2025-10-13 13:22:52.021540247 +0000 UTC m=+949.555090503" watchObservedRunningTime="2025-10-13 13:22:52.024257333 +0000 UTC m=+949.557807589" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.039933 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" podStartSLOduration=6.490112012 podStartE2EDuration="29.039916408s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 
13:22:25.280515715 +0000 UTC m=+922.814065971" lastFinishedPulling="2025-10-13 13:22:47.830320111 +0000 UTC m=+945.363870367" observedRunningTime="2025-10-13 13:22:52.038531344 +0000 UTC m=+949.572081600" watchObservedRunningTime="2025-10-13 13:22:52.039916408 +0000 UTC m=+949.573466664" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.076494 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w9st4" podStartSLOduration=4.153631908 podStartE2EDuration="28.076478635s" podCreationTimestamp="2025-10-13 13:22:24 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.733709686 +0000 UTC m=+923.267259942" lastFinishedPulling="2025-10-13 13:22:49.656556403 +0000 UTC m=+947.190106669" observedRunningTime="2025-10-13 13:22:52.072018775 +0000 UTC m=+949.605569031" watchObservedRunningTime="2025-10-13 13:22:52.076478635 +0000 UTC m=+949.610028891" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.138641 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" podStartSLOduration=4.678531619 podStartE2EDuration="29.13862621s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.381060422 +0000 UTC m=+922.914610678" lastFinishedPulling="2025-10-13 13:22:49.841155003 +0000 UTC m=+947.374705269" observedRunningTime="2025-10-13 13:22:52.135577785 +0000 UTC m=+949.669128041" watchObservedRunningTime="2025-10-13 13:22:52.13862621 +0000 UTC m=+949.672176466" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.156494 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" podStartSLOduration=4.714383129 podStartE2EDuration="29.156468828s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.406895326 +0000 UTC 
m=+922.940445582" lastFinishedPulling="2025-10-13 13:22:49.848981025 +0000 UTC m=+947.382531281" observedRunningTime="2025-10-13 13:22:52.148728738 +0000 UTC m=+949.682279014" watchObservedRunningTime="2025-10-13 13:22:52.156468828 +0000 UTC m=+949.690019084" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.170419 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946" podStartSLOduration=6.261366768 podStartE2EDuration="29.170395419s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:24.92252305 +0000 UTC m=+922.456073326" lastFinishedPulling="2025-10-13 13:22:47.831551721 +0000 UTC m=+945.365101977" observedRunningTime="2025-10-13 13:22:52.162206229 +0000 UTC m=+949.695756505" watchObservedRunningTime="2025-10-13 13:22:52.170395419 +0000 UTC m=+949.703945665" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.178795 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" podStartSLOduration=4.661144082 podStartE2EDuration="29.178784145s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.342373973 +0000 UTC m=+922.875924229" lastFinishedPulling="2025-10-13 13:22:49.860014036 +0000 UTC m=+947.393564292" observedRunningTime="2025-10-13 13:22:52.177887903 +0000 UTC m=+949.711438169" watchObservedRunningTime="2025-10-13 13:22:52.178784145 +0000 UTC m=+949.712334401" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.201899 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj" podStartSLOduration=4.8569236159999996 podStartE2EDuration="29.201879392s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.311452014 +0000 UTC m=+922.845002270" 
lastFinishedPulling="2025-10-13 13:22:49.65640779 +0000 UTC m=+947.189958046" observedRunningTime="2025-10-13 13:22:52.197452723 +0000 UTC m=+949.731002989" watchObservedRunningTime="2025-10-13 13:22:52.201879392 +0000 UTC m=+949.735429648" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.908761 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" event={"ID":"2ce34775-733e-42d7-a688-4c12edad7614","Type":"ContainerStarted","Data":"573e7b82717e8d65ae2de72474151e261dfb55b977c51b1a86d3f7f71e8c8aa5"} Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.911098 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" Oct 13 13:22:52 crc kubenswrapper[4797]: I1013 13:22:52.927481 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" podStartSLOduration=2.456081933 podStartE2EDuration="29.927460187s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:24.91068514 +0000 UTC m=+922.444235396" lastFinishedPulling="2025-10-13 13:22:52.382063394 +0000 UTC m=+949.915613650" observedRunningTime="2025-10-13 13:22:52.926250727 +0000 UTC m=+950.459801043" watchObservedRunningTime="2025-10-13 13:22:52.927460187 +0000 UTC m=+950.461010453" Oct 13 13:22:53 crc kubenswrapper[4797]: I1013 13:22:53.636391 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" Oct 13 13:23:02 crc kubenswrapper[4797]: I1013 13:23:02.021495 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" 
event={"ID":"f092abb5-bd31-41f3-bedb-1e9523f17044","Type":"ContainerStarted","Data":"a45e5a4178047dac2fd63a02882822eab2181d01ae89135f1caee625a7ee7ec6"} Oct 13 13:23:02 crc kubenswrapper[4797]: I1013 13:23:02.022319 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" Oct 13 13:23:02 crc kubenswrapper[4797]: I1013 13:23:02.042773 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" podStartSLOduration=2.7134946490000003 podStartE2EDuration="39.042757381s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.327320653 +0000 UTC m=+922.860870909" lastFinishedPulling="2025-10-13 13:23:01.656583345 +0000 UTC m=+959.190133641" observedRunningTime="2025-10-13 13:23:02.041200223 +0000 UTC m=+959.574750559" watchObservedRunningTime="2025-10-13 13:23:02.042757381 +0000 UTC m=+959.576307657" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.639306 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-4xbcc" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.656327 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-jtc6c" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.699084 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rdczf" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.703644 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-79946" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.728878 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-sdtkf" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.778674 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-pqvwj" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.814370 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-dh4qb" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.832499 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-znjhj" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.866273 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-m9cpw" Oct 13 13:23:03 crc kubenswrapper[4797]: I1013 13:23:03.952797 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-bld7w" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.040354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" event={"ID":"990f0215-8f03-4fb7-ae16-0d89130a5ba3","Type":"ContainerStarted","Data":"5678ff02ffeceb2872a4f612b681bca9e8b87fc482b053b7f23a3a06d1c92db9"} Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.040602 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.059437 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" podStartSLOduration=2.665007768 
podStartE2EDuration="41.059418446s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.319504751 +0000 UTC m=+922.853055007" lastFinishedPulling="2025-10-13 13:23:03.713915429 +0000 UTC m=+961.247465685" observedRunningTime="2025-10-13 13:23:04.05510395 +0000 UTC m=+961.588654216" watchObservedRunningTime="2025-10-13 13:23:04.059418446 +0000 UTC m=+961.592968702" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.073597 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-v826q" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.085924 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5458f77c4-vrt4f" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.119889 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-qsnt6" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.219721 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-766z6" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.238003 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-kgmrr" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.407372 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-ntnr8" Oct 13 13:23:04 crc kubenswrapper[4797]: I1013 13:23:04.455052 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55b7d448487kdnh" Oct 13 13:23:05 crc kubenswrapper[4797]: I1013 13:23:05.238156 4797 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 13:23:06 crc kubenswrapper[4797]: I1013 13:23:06.056128 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" event={"ID":"13487799-53f7-4c74-9f16-770bf4dbace5","Type":"ContainerStarted","Data":"9c7b5d113706fd0bc61abcd0db497af1f3dc85d326d2109c5b8ec3a3ac428977"} Oct 13 13:23:06 crc kubenswrapper[4797]: I1013 13:23:06.056581 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" Oct 13 13:23:06 crc kubenswrapper[4797]: I1013 13:23:06.079461 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" podStartSLOduration=2.685379839 podStartE2EDuration="43.079444725s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.276876325 +0000 UTC m=+922.810426581" lastFinishedPulling="2025-10-13 13:23:05.670941211 +0000 UTC m=+963.204491467" observedRunningTime="2025-10-13 13:23:06.077307453 +0000 UTC m=+963.610857749" watchObservedRunningTime="2025-10-13 13:23:06.079444725 +0000 UTC m=+963.612994981" Oct 13 13:23:08 crc kubenswrapper[4797]: I1013 13:23:08.071087 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" event={"ID":"f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf","Type":"ContainerStarted","Data":"1971b79f830a0fbaa71a8c7d1532867d243f3b384d0989547f41300a276d039b"} Oct 13 13:23:08 crc kubenswrapper[4797]: I1013 13:23:08.072489 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" Oct 13 13:23:13 crc kubenswrapper[4797]: I1013 13:23:13.664340 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-84b9b84486-vfpgt" Oct 13 13:23:13 crc kubenswrapper[4797]: I1013 13:23:13.682883 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" podStartSLOduration=8.072298398 podStartE2EDuration="50.682864933s" podCreationTimestamp="2025-10-13 13:22:23 +0000 UTC" firstStartedPulling="2025-10-13 13:22:25.282480933 +0000 UTC m=+922.816031189" lastFinishedPulling="2025-10-13 13:23:07.893047458 +0000 UTC m=+965.426597724" observedRunningTime="2025-10-13 13:23:08.098052598 +0000 UTC m=+965.631602864" watchObservedRunningTime="2025-10-13 13:23:13.682864933 +0000 UTC m=+971.216415189" Oct 13 13:23:13 crc kubenswrapper[4797]: I1013 13:23:13.796466 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5df598886f-fcgbt" Oct 13 13:23:13 crc kubenswrapper[4797]: I1013 13:23:13.826750 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-l8d5s" Oct 13 13:23:14 crc kubenswrapper[4797]: I1013 13:23:14.198839 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-mqjcg" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.585467 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d487d97d7-jhb2q"] Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.586966 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.592706 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7kx24" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.592693 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.592894 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.593136 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.613274 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d487d97d7-jhb2q"] Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.653914 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6948694bd9-qlpcp"] Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.667403 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jclc\" (UniqueName: \"kubernetes.io/projected/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-kube-api-access-2jclc\") pod \"dnsmasq-dns-5d487d97d7-jhb2q\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.667454 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-config\") pod \"dnsmasq-dns-5d487d97d7-jhb2q\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.667694 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6948694bd9-qlpcp"] Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.667842 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.676659 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.771583 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-config\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.771707 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jclc\" (UniqueName: \"kubernetes.io/projected/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-kube-api-access-2jclc\") pod \"dnsmasq-dns-5d487d97d7-jhb2q\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.771753 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-config\") pod \"dnsmasq-dns-5d487d97d7-jhb2q\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.771837 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-dns-svc\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 
13:23:29.771857 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gt64\" (UniqueName: \"kubernetes.io/projected/9908f76b-b586-4e5c-b77d-21e49c072ebc-kube-api-access-8gt64\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.772729 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-config\") pod \"dnsmasq-dns-5d487d97d7-jhb2q\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.788974 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jclc\" (UniqueName: \"kubernetes.io/projected/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-kube-api-access-2jclc\") pod \"dnsmasq-dns-5d487d97d7-jhb2q\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.873571 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-config\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.873956 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-dns-svc\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.873981 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8gt64\" (UniqueName: \"kubernetes.io/projected/9908f76b-b586-4e5c-b77d-21e49c072ebc-kube-api-access-8gt64\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.874870 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-dns-svc\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.874926 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-config\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.909369 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.909627 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gt64\" (UniqueName: \"kubernetes.io/projected/9908f76b-b586-4e5c-b77d-21e49c072ebc-kube-api-access-8gt64\") pod \"dnsmasq-dns-6948694bd9-qlpcp\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:29 crc kubenswrapper[4797]: I1013 13:23:29.995129 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:30 crc kubenswrapper[4797]: I1013 13:23:30.450792 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d487d97d7-jhb2q"] Oct 13 13:23:30 crc kubenswrapper[4797]: I1013 13:23:30.509468 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6948694bd9-qlpcp"] Oct 13 13:23:30 crc kubenswrapper[4797]: W1013 13:23:30.513447 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9908f76b_b586_4e5c_b77d_21e49c072ebc.slice/crio-5bf9b1116b220c3a6491066de70aa944930799796f892c4c41ed014d6040ec39 WatchSource:0}: Error finding container 5bf9b1116b220c3a6491066de70aa944930799796f892c4c41ed014d6040ec39: Status 404 returned error can't find the container with id 5bf9b1116b220c3a6491066de70aa944930799796f892c4c41ed014d6040ec39 Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.250375 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" event={"ID":"fa65711d-07d4-4d5e-bd53-1762a14fc2b7","Type":"ContainerStarted","Data":"dccabb8e1b62f0b66c6371e7e633cdc00a75b6c8349518dea0a164dab99dfb04"} Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.250536 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" event={"ID":"9908f76b-b586-4e5c-b77d-21e49c072ebc","Type":"ContainerStarted","Data":"5bf9b1116b220c3a6491066de70aa944930799796f892c4c41ed014d6040ec39"} Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.603302 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d487d97d7-jhb2q"] Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.635222 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86f694bf-m8ztx"] Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.636680 4797 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.651563 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f694bf-m8ztx"] Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.719619 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-config\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.719867 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrgw5\" (UniqueName: \"kubernetes.io/projected/0df186e7-541e-4998-bba3-95f086636a6d-kube-api-access-rrgw5\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.719934 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-dns-svc\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.821832 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrgw5\" (UniqueName: \"kubernetes.io/projected/0df186e7-541e-4998-bba3-95f086636a6d-kube-api-access-rrgw5\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.821877 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-dns-svc\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.821923 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-config\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.823152 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-dns-svc\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.823341 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-config\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.844632 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrgw5\" (UniqueName: \"kubernetes.io/projected/0df186e7-541e-4998-bba3-95f086636a6d-kube-api-access-rrgw5\") pod \"dnsmasq-dns-86f694bf-m8ztx\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:31 crc kubenswrapper[4797]: I1013 13:23:31.992548 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.507510 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6948694bd9-qlpcp"] Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.519133 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f694bf-m8ztx"] Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.533908 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7869c47d6c-g9669"] Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.537552 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.550588 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7869c47d6c-g9669"] Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.643698 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzd2r\" (UniqueName: \"kubernetes.io/projected/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-kube-api-access-wzd2r\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.643952 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-dns-svc\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.644002 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-config\") pod 
\"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.745855 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-dns-svc\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.745931 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-config\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.745999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzd2r\" (UniqueName: \"kubernetes.io/projected/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-kube-api-access-wzd2r\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.746861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-dns-svc\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.746867 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-config\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " 
pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.781832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzd2r\" (UniqueName: \"kubernetes.io/projected/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-kube-api-access-wzd2r\") pod \"dnsmasq-dns-7869c47d6c-g9669\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.787350 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.788574 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.792184 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.792454 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.792622 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.794239 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pt9kq" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.794327 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.794423 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.794962 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 
13:23:32.822715 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.882102 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970712 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970756 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21067728-d3cf-4ff2-94c9-87600f7324ab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970779 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970797 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d865\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-kube-api-access-5d865\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970841 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970858 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970882 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970922 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970937 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970953 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:32 crc kubenswrapper[4797]: I1013 13:23:32.970992 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21067728-d3cf-4ff2-94c9-87600f7324ab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.073046 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.077738 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.078791 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.078961 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.078999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079110 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21067728-d3cf-4ff2-94c9-87600f7324ab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079151 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079179 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21067728-d3cf-4ff2-94c9-87600f7324ab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079215 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079237 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5d865\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-kube-api-access-5d865\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079320 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.079373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.084090 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.084618 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.085278 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.085329 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.085664 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.090620 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21067728-d3cf-4ff2-94c9-87600f7324ab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.095746 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 
13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.102767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d865\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-kube-api-access-5d865\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.102869 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21067728-d3cf-4ff2-94c9-87600f7324ab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.137911 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.180518 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.217029 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7869c47d6c-g9669"] Oct 13 13:23:33 crc kubenswrapper[4797]: W1013 13:23:33.242370 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded404a60_7b1e_4d3d_91a1_50a66c87f7b4.slice/crio-9bec38f6ba065f92070a6bb0b0e9217b2d2a843288e7b4ab90d27c1e11b94ad7 WatchSource:0}: Error finding container 9bec38f6ba065f92070a6bb0b0e9217b2d2a843288e7b4ab90d27c1e11b94ad7: Status 404 returned error can't find the container with id 9bec38f6ba065f92070a6bb0b0e9217b2d2a843288e7b4ab90d27c1e11b94ad7 Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.267748 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" event={"ID":"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4","Type":"ContainerStarted","Data":"9bec38f6ba065f92070a6bb0b0e9217b2d2a843288e7b4ab90d27c1e11b94ad7"} Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.270349 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" event={"ID":"0df186e7-541e-4998-bba3-95f086636a6d","Type":"ContainerStarted","Data":"823aa6bafcf219db1ff2bc72d64b5ecc41c6e8cbc893969d92b63e1e4711d0d8"} Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.652528 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.655557 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.659494 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hz7lv" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.664634 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.664846 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.665193 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.665310 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.665431 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.665544 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.674938 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.767378 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:23:33 crc kubenswrapper[4797]: W1013 13:23:33.786625 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21067728_d3cf_4ff2_94c9_87600f7324ab.slice/crio-c827d0ec843c5e1ccb7ad4f30660ae1ed3fff3b98977fa073ac90305847b44f4 WatchSource:0}: Error finding container 
c827d0ec843c5e1ccb7ad4f30660ae1ed3fff3b98977fa073ac90305847b44f4: Status 404 returned error can't find the container with id c827d0ec843c5e1ccb7ad4f30660ae1ed3fff3b98977fa073ac90305847b44f4 Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.788934 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789078 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789146 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49qwl\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-kube-api-access-49qwl\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789169 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/acdec9fc-360a-46e4-89ea-3fde84f417c0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789220 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789236 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acdec9fc-360a-46e4-89ea-3fde84f417c0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789259 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789278 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789295 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.789315 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49qwl\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-kube-api-access-49qwl\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890655 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acdec9fc-360a-46e4-89ea-3fde84f417c0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890706 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890722 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acdec9fc-360a-46e4-89ea-3fde84f417c0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890745 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890765 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.890785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.891926 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.892028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.892049 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.892097 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.892229 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.892496 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc 
kubenswrapper[4797]: I1013 13:23:33.892689 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.893052 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.893776 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.897205 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acdec9fc-360a-46e4-89ea-3fde84f417c0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.897693 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acdec9fc-360a-46e4-89ea-3fde84f417c0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.898667 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.905521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.905821 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49qwl\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-kube-api-access-49qwl\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.951435 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:33 crc kubenswrapper[4797]: I1013 13:23:33.990185 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:23:34 crc kubenswrapper[4797]: I1013 13:23:34.294106 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"21067728-d3cf-4ff2-94c9-87600f7324ab","Type":"ContainerStarted","Data":"c827d0ec843c5e1ccb7ad4f30660ae1ed3fff3b98977fa073ac90305847b44f4"} Oct 13 13:23:34 crc kubenswrapper[4797]: I1013 13:23:34.579970 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.081953 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.083992 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.087414 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.088022 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.088144 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.088289 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.088539 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-d7ldc" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.098305 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.093153 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.218370 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-generated\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.218425 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-operator-scripts\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.218673 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-kolla-config\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.218752 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.218918 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" 
Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.218963 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-default\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.219020 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jxcg\" (UniqueName: \"kubernetes.io/projected/300309d9-4375-4cce-8fb1-0833d2cfdcde-kube-api-access-7jxcg\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.219249 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-secrets\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.219304 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320696 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320740 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-default\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320768 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxcg\" (UniqueName: \"kubernetes.io/projected/300309d9-4375-4cce-8fb1-0833d2cfdcde-kube-api-access-7jxcg\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320830 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-secrets\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320859 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320914 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-generated\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320935 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-operator-scripts\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.320994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-kolla-config\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.321009 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.322120 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.324367 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-default\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.324390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.324593 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-operator-scripts\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.327359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-kolla-config\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.335076 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.335151 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.336891 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-secrets\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.337422 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"acdec9fc-360a-46e4-89ea-3fde84f417c0","Type":"ContainerStarted","Data":"7ca684bd06b79cd361e9ae37a6ebd09af0d174370e8b4c826f82ccdf5efb69f3"} Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.343052 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxcg\" (UniqueName: \"kubernetes.io/projected/300309d9-4375-4cce-8fb1-0833d2cfdcde-kube-api-access-7jxcg\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.368609 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " pod="openstack/openstack-galera-0" Oct 13 13:23:35 crc kubenswrapper[4797]: I1013 13:23:35.460383 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.094073 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 13:23:36 crc kubenswrapper[4797]: W1013 13:23:36.098838 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod300309d9_4375_4cce_8fb1_0833d2cfdcde.slice/crio-a674b1af4d463b4e06f89e5b0191853be57dd6654cd17eb0f136ece2fbae1d13 WatchSource:0}: Error finding container a674b1af4d463b4e06f89e5b0191853be57dd6654cd17eb0f136ece2fbae1d13: Status 404 returned error can't find the container with id a674b1af4d463b4e06f89e5b0191853be57dd6654cd17eb0f136ece2fbae1d13 Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.358202 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"300309d9-4375-4cce-8fb1-0833d2cfdcde","Type":"ContainerStarted","Data":"a674b1af4d463b4e06f89e5b0191853be57dd6654cd17eb0f136ece2fbae1d13"} Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.484518 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.486513 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.489201 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-krqv6" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.489691 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.489711 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.489872 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.509039 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542104 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542160 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542181 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-default\") 
pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542204 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542317 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh98n\" (UniqueName: \"kubernetes.io/projected/f12892ce-6d68-4f79-b1dd-e874dffba145-kube-api-access-xh98n\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542455 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542500 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.542528 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653604 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh98n\" (UniqueName: \"kubernetes.io/projected/f12892ce-6d68-4f79-b1dd-e874dffba145-kube-api-access-xh98n\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653656 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653678 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653696 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-kolla-config\") 
pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653731 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653762 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653778 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653817 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.653881 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " 
pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.654242 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.655636 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.655690 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.656008 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.659880 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: 
I1013 13:23:36.660866 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.677602 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.685046 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.690741 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh98n\" (UniqueName: \"kubernetes.io/projected/f12892ce-6d68-4f79-b1dd-e874dffba145-kube-api-access-xh98n\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.696475 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.777880 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 
13:23:36.779535 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.790966 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-tlz4z" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.791220 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.791359 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.808638 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.811895 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.860606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.860680 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-config-data\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.860733 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhskg\" (UniqueName: \"kubernetes.io/projected/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kube-api-access-dhskg\") pod 
\"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.860769 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kolla-config\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.860786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.962523 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-config-data\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.962631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhskg\" (UniqueName: \"kubernetes.io/projected/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kube-api-access-dhskg\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.962698 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kolla-config\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.962718 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.962762 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.965651 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kolla-config\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.966452 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-config-data\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.969150 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.988233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhskg\" (UniqueName: \"kubernetes.io/projected/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kube-api-access-dhskg\") pod \"memcached-0\" (UID: 
\"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:36 crc kubenswrapper[4797]: I1013 13:23:36.989698 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " pod="openstack/memcached-0" Oct 13 13:23:37 crc kubenswrapper[4797]: I1013 13:23:37.135372 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.190827 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.192376 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.198911 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.199612 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-q2h4d" Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.300708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ch6m\" (UniqueName: \"kubernetes.io/projected/130b6035-6c63-4a81-b112-fdf5da3d970e-kube-api-access-4ch6m\") pod \"kube-state-metrics-0\" (UID: \"130b6035-6c63-4a81-b112-fdf5da3d970e\") " pod="openstack/kube-state-metrics-0" Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.402695 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ch6m\" (UniqueName: \"kubernetes.io/projected/130b6035-6c63-4a81-b112-fdf5da3d970e-kube-api-access-4ch6m\") pod \"kube-state-metrics-0\" (UID: 
\"130b6035-6c63-4a81-b112-fdf5da3d970e\") " pod="openstack/kube-state-metrics-0" Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.444006 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ch6m\" (UniqueName: \"kubernetes.io/projected/130b6035-6c63-4a81-b112-fdf5da3d970e-kube-api-access-4ch6m\") pod \"kube-state-metrics-0\" (UID: \"130b6035-6c63-4a81-b112-fdf5da3d970e\") " pod="openstack/kube-state-metrics-0" Oct 13 13:23:38 crc kubenswrapper[4797]: I1013 13:23:38.540030 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.889216 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-htk8n"] Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.891029 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.893661 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.894701 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ljwgd" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.900593 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.953874 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-htk8n"] Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.968219 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2mpq9"] Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.969752 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbr5d\" (UniqueName: \"kubernetes.io/projected/85dd770b-9d5c-4cc9-adaa-87963d5bb160-kube-api-access-fbr5d\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983225 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-log-ovn\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983283 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-ovn-controller-tls-certs\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983340 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run-ovn\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983384 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " 
pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983415 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-combined-ca-bundle\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.983441 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85dd770b-9d5c-4cc9-adaa-87963d5bb160-scripts\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:42 crc kubenswrapper[4797]: I1013 13:23:42.987456 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2mpq9"] Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.038402 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.040303 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.042634 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.042885 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.044491 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.044938 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.045021 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.045286 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9frxx" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.085833 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-ovn-controller-tls-certs\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.085889 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run-ovn\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.085919 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.085952 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsw2g\" (UniqueName: \"kubernetes.io/projected/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-kube-api-access-wsw2g\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.085972 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-combined-ca-bundle\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.085995 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85dd770b-9d5c-4cc9-adaa-87963d5bb160-scripts\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086013 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-run\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086047 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-lib\") pod 
\"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086062 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-scripts\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086098 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbr5d\" (UniqueName: \"kubernetes.io/projected/85dd770b-9d5c-4cc9-adaa-87963d5bb160-kube-api-access-fbr5d\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086120 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-log\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086609 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-log-ovn\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.086658 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-etc-ovs\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " 
pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.088848 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run-ovn\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.088978 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.089134 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-log-ovn\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.090855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85dd770b-9d5c-4cc9-adaa-87963d5bb160-scripts\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.094058 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-ovn-controller-tls-certs\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.094290 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-combined-ca-bundle\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.106903 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbr5d\" (UniqueName: \"kubernetes.io/projected/85dd770b-9d5c-4cc9-adaa-87963d5bb160-kube-api-access-fbr5d\") pod \"ovn-controller-htk8n\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.187623 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.187959 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsw2g\" (UniqueName: \"kubernetes.io/projected/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-kube-api-access-wsw2g\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.187986 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188010 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-lib\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188104 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188222 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-lib\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188241 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188316 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-etc-ovs\") pod \"ovn-controller-ovs-2mpq9\" (UID: 
\"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188387 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188417 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktv47\" (UniqueName: \"kubernetes.io/projected/244e58a1-ed2c-4ff6-8885-ebd066e8adab-kube-api-access-ktv47\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188446 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-config\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188463 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-run\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-etc-ovs\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188523 
4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-scripts\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188540 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-run\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188604 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-log\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.188790 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-log\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.190890 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-scripts\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.207159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsw2g\" (UniqueName: 
\"kubernetes.io/projected/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-kube-api-access-wsw2g\") pod \"ovn-controller-ovs-2mpq9\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.254305 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-htk8n" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289844 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktv47\" (UniqueName: \"kubernetes.io/projected/244e58a1-ed2c-4ff6-8885-ebd066e8adab-kube-api-access-ktv47\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289869 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-config\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289927 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289958 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289978 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.289996 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.290027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.290172 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.290273 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.290649 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.291369 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.291705 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-config\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.293915 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.294362 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.305924 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.317131 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktv47\" (UniqueName: \"kubernetes.io/projected/244e58a1-ed2c-4ff6-8885-ebd066e8adab-kube-api-access-ktv47\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.319783 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:43 crc kubenswrapper[4797]: I1013 13:23:43.355991 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.592536 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.594258 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.598131 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.598182 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jnp2z" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.598421 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.598586 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.605750 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745028 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-config\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745077 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745110 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745169 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745262 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znls5\" (UniqueName: \"kubernetes.io/projected/047404d9-b0ab-44e2-a31d-94d8fe429698-kube-api-access-znls5\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745324 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745349 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.745384 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 
13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846681 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846733 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846770 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846798 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-config\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846865 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846889 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846921 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.846986 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znls5\" (UniqueName: \"kubernetes.io/projected/047404d9-b0ab-44e2-a31d-94d8fe429698-kube-api-access-znls5\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.847232 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.847364 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.849164 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " 
pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.849563 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-config\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.854182 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.854712 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.860380 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.867105 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znls5\" (UniqueName: \"kubernetes.io/projected/047404d9-b0ab-44e2-a31d-94d8fe429698-kube-api-access-znls5\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.876411 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:45 crc kubenswrapper[4797]: I1013 13:23:45.922607 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 13:23:55 crc kubenswrapper[4797]: E1013 13:23:55.239716 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:97feaea1e66145857f3eb548d741ee56062b97fd3e8f4d136a5ca807c49c0cca" Oct 13 13:23:55 crc kubenswrapper[4797]: E1013 13:23:55.240280 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:97feaea1e66145857f3eb548d741ee56062b97fd3e8f4d136a5ca807c49c0cca,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5d865,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(21067728-d3cf-4ff2-94c9-87600f7324ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:23:55 crc 
kubenswrapper[4797]: E1013 13:23:55.242238 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" Oct 13 13:23:55 crc kubenswrapper[4797]: E1013 13:23:55.504479 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:97feaea1e66145857f3eb548d741ee56062b97fd3e8f4d136a5ca807c49c0cca\\\"\"" pod="openstack/rabbitmq-server-0" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" Oct 13 13:23:56 crc kubenswrapper[4797]: E1013 13:23:56.781914 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:91737b02501b73af0aee486b7447b4ae3005c904f31f1a9bf4047d0433586f80" Oct 13 13:23:56 crc kubenswrapper[4797]: E1013 13:23:56.782422 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:91737b02501b73af0aee486b7447b4ae3005c904f31f1a9bf4047d0433586f80,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jxcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(300309d9-4375-4cce-8fb1-0833d2cfdcde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:23:56 crc kubenswrapper[4797]: E1013 13:23:56.783717 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.542604 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:91737b02501b73af0aee486b7447b4ae3005c904f31f1a9bf4047d0433586f80\\\"\"" pod="openstack/openstack-galera-0" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.573334 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.573498 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d 
--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gt64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6948694bd9-qlpcp_openstack(9908f76b-b586-4e5c-b77d-21e49c072ebc): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.574837 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" podUID="9908f76b-b586-4e5c-b77d-21e49c072ebc" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.680390 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.683128 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wzd2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7869c47d6c-g9669_openstack(ed404a60-7b1e-4d3d-91a1-50a66c87f7b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.684225 4797 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" podUID="ed404a60-7b1e-4d3d-91a1-50a66c87f7b4" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.784029 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.784206 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jclc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d487d97d7-jhb2q_openstack(fa65711d-07d4-4d5e-bd53-1762a14fc2b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.785303 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" podUID="fa65711d-07d4-4d5e-bd53-1762a14fc2b7" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.864277 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.864721 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrgw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86f694bf-m8ztx_openstack(0df186e7-541e-4998-bba3-95f086636a6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:23:57 crc kubenswrapper[4797]: E1013 13:23:57.866145 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" podUID="0df186e7-541e-4998-bba3-95f086636a6d" Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.147896 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.151746 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.197538 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 13:23:58 crc kubenswrapper[4797]: W1013 13:23:58.211163 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130b6035_6c63_4a81_b112_fdf5da3d970e.slice/crio-c526830c40a400cbe7ea0f85fb6104a001568d00f6f4b35837275270953f1ee7 WatchSource:0}: Error finding container c526830c40a400cbe7ea0f85fb6104a001568d00f6f4b35837275270953f1ee7: Status 404 returned error can't find the container with id c526830c40a400cbe7ea0f85fb6104a001568d00f6f4b35837275270953f1ee7 Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.235197 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-htk8n"] Oct 13 13:23:58 crc kubenswrapper[4797]: W1013 13:23:58.258638 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85dd770b_9d5c_4cc9_adaa_87963d5bb160.slice/crio-a4a9aab6f4c0faac2c2585b800c20e01af800d5b5329d28e1dab0619122eeaa1 WatchSource:0}: Error finding container a4a9aab6f4c0faac2c2585b800c20e01af800d5b5329d28e1dab0619122eeaa1: Status 404 returned error can't find the container with id a4a9aab6f4c0faac2c2585b800c20e01af800d5b5329d28e1dab0619122eeaa1 Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.325148 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 13:23:58 crc kubenswrapper[4797]: W1013 13:23:58.327657 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod047404d9_b0ab_44e2_a31d_94d8fe429698.slice/crio-962baa4a87d5c8ee7ad1b1e524a17b551a24101a2211c7f0874b486652e9f832 WatchSource:0}: Error finding container 962baa4a87d5c8ee7ad1b1e524a17b551a24101a2211c7f0874b486652e9f832: Status 404 returned error can't find the container with id 962baa4a87d5c8ee7ad1b1e524a17b551a24101a2211c7f0874b486652e9f832 Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.534492 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab","Type":"ContainerStarted","Data":"2be24ed4ffb6d0135f074533e692c594fecabeca9e1e729455ef7aa0af6ec4f2"} Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.535998 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047404d9-b0ab-44e2-a31d-94d8fe429698","Type":"ContainerStarted","Data":"962baa4a87d5c8ee7ad1b1e524a17b551a24101a2211c7f0874b486652e9f832"} Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.537259 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f12892ce-6d68-4f79-b1dd-e874dffba145","Type":"ContainerStarted","Data":"559fda50c03ecc7086dae8eeea693d15ca9e11a5636b2093106d5a1dbac0d7f3"} Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.538593 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"130b6035-6c63-4a81-b112-fdf5da3d970e","Type":"ContainerStarted","Data":"c526830c40a400cbe7ea0f85fb6104a001568d00f6f4b35837275270953f1ee7"} Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.539596 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n" event={"ID":"85dd770b-9d5c-4cc9-adaa-87963d5bb160","Type":"ContainerStarted","Data":"a4a9aab6f4c0faac2c2585b800c20e01af800d5b5329d28e1dab0619122eeaa1"} Oct 13 13:23:58 crc kubenswrapper[4797]: E1013 13:23:58.542032 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a\\\"\"" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" podUID="0df186e7-541e-4998-bba3-95f086636a6d" Oct 13 13:23:58 crc kubenswrapper[4797]: E1013 13:23:58.542307 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a\\\"\"" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" podUID="ed404a60-7b1e-4d3d-91a1-50a66c87f7b4" Oct 13 13:23:58 crc kubenswrapper[4797]: I1013 13:23:58.969401 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.004343 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.012148 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.108518 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gt64\" (UniqueName: \"kubernetes.io/projected/9908f76b-b586-4e5c-b77d-21e49c072ebc-kube-api-access-8gt64\") pod \"9908f76b-b586-4e5c-b77d-21e49c072ebc\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.108593 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jclc\" (UniqueName: \"kubernetes.io/projected/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-kube-api-access-2jclc\") pod \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.108636 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-dns-svc\") pod \"9908f76b-b586-4e5c-b77d-21e49c072ebc\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.108759 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-config\") pod \"9908f76b-b586-4e5c-b77d-21e49c072ebc\" (UID: \"9908f76b-b586-4e5c-b77d-21e49c072ebc\") " Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.108833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-config\") pod \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\" (UID: \"fa65711d-07d4-4d5e-bd53-1762a14fc2b7\") " Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.109605 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9908f76b-b586-4e5c-b77d-21e49c072ebc" (UID: "9908f76b-b586-4e5c-b77d-21e49c072ebc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.109673 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-config" (OuterVolumeSpecName: "config") pod "9908f76b-b586-4e5c-b77d-21e49c072ebc" (UID: "9908f76b-b586-4e5c-b77d-21e49c072ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.109703 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-config" (OuterVolumeSpecName: "config") pod "fa65711d-07d4-4d5e-bd53-1762a14fc2b7" (UID: "fa65711d-07d4-4d5e-bd53-1762a14fc2b7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.109857 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.109875 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.109885 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9908f76b-b586-4e5c-b77d-21e49c072ebc-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.112971 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-kube-api-access-2jclc" (OuterVolumeSpecName: "kube-api-access-2jclc") pod "fa65711d-07d4-4d5e-bd53-1762a14fc2b7" (UID: "fa65711d-07d4-4d5e-bd53-1762a14fc2b7"). InnerVolumeSpecName "kube-api-access-2jclc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.113498 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9908f76b-b586-4e5c-b77d-21e49c072ebc-kube-api-access-8gt64" (OuterVolumeSpecName: "kube-api-access-8gt64") pod "9908f76b-b586-4e5c-b77d-21e49c072ebc" (UID: "9908f76b-b586-4e5c-b77d-21e49c072ebc"). InnerVolumeSpecName "kube-api-access-8gt64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.211478 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gt64\" (UniqueName: \"kubernetes.io/projected/9908f76b-b586-4e5c-b77d-21e49c072ebc-kube-api-access-8gt64\") on node \"crc\" DevicePath \"\"" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.211505 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jclc\" (UniqueName: \"kubernetes.io/projected/fa65711d-07d4-4d5e-bd53-1762a14fc2b7-kube-api-access-2jclc\") on node \"crc\" DevicePath \"\"" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.234663 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2mpq9"] Oct 13 13:23:59 crc kubenswrapper[4797]: W1013 13:23:59.402019 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac6531d_4d7d_4cf0_b943_984f885b4a6d.slice/crio-670bd2bb956f74860f67d01b8baeecf23d1c7656d71c3a7834b27f415b0e8483 WatchSource:0}: Error finding container 670bd2bb956f74860f67d01b8baeecf23d1c7656d71c3a7834b27f415b0e8483: Status 404 returned error can't find the container with id 670bd2bb956f74860f67d01b8baeecf23d1c7656d71c3a7834b27f415b0e8483 Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.550630 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerStarted","Data":"670bd2bb956f74860f67d01b8baeecf23d1c7656d71c3a7834b27f415b0e8483"} Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.552314 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"244e58a1-ed2c-4ff6-8885-ebd066e8adab","Type":"ContainerStarted","Data":"a68ed4441c229162a797613f77467948c60aa25c7f213142f53e88b4f973d9c5"} Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.553889 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" event={"ID":"fa65711d-07d4-4d5e-bd53-1762a14fc2b7","Type":"ContainerDied","Data":"dccabb8e1b62f0b66c6371e7e633cdc00a75b6c8349518dea0a164dab99dfb04"} Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.553999 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d487d97d7-jhb2q" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.558257 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f12892ce-6d68-4f79-b1dd-e874dffba145","Type":"ContainerStarted","Data":"c055cb85c274325ba4dfd7ffd98811f07884655ca3f5193c1fd0181b74581d57"} Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.560577 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.562263 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6948694bd9-qlpcp" event={"ID":"9908f76b-b586-4e5c-b77d-21e49c072ebc","Type":"ContainerDied","Data":"5bf9b1116b220c3a6491066de70aa944930799796f892c4c41ed014d6040ec39"} Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.566023 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"acdec9fc-360a-46e4-89ea-3fde84f417c0","Type":"ContainerStarted","Data":"c87767a147d4c8704a237c416cfdc4858485ccdb790924187fa2558d28ea1605"} Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.686376 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d487d97d7-jhb2q"] Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.695454 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d487d97d7-jhb2q"] Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.707651 4797 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-6948694bd9-qlpcp"] Oct 13 13:23:59 crc kubenswrapper[4797]: I1013 13:23:59.712397 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6948694bd9-qlpcp"] Oct 13 13:24:01 crc kubenswrapper[4797]: I1013 13:24:01.245005 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9908f76b-b586-4e5c-b77d-21e49c072ebc" path="/var/lib/kubelet/pods/9908f76b-b586-4e5c-b77d-21e49c072ebc/volumes" Oct 13 13:24:01 crc kubenswrapper[4797]: I1013 13:24:01.245470 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa65711d-07d4-4d5e-bd53-1762a14fc2b7" path="/var/lib/kubelet/pods/fa65711d-07d4-4d5e-bd53-1762a14fc2b7/volumes" Oct 13 13:24:02 crc kubenswrapper[4797]: I1013 13:24:02.591042 4797 generic.go:334] "Generic (PLEG): container finished" podID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerID="c055cb85c274325ba4dfd7ffd98811f07884655ca3f5193c1fd0181b74581d57" exitCode=0 Oct 13 13:24:02 crc kubenswrapper[4797]: I1013 13:24:02.591262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f12892ce-6d68-4f79-b1dd-e874dffba145","Type":"ContainerDied","Data":"c055cb85c274325ba4dfd7ffd98811f07884655ca3f5193c1fd0181b74581d57"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.598208 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"130b6035-6c63-4a81-b112-fdf5da3d970e","Type":"ContainerStarted","Data":"98984ae0b0aa073e880a1ec2c8d909773dc9cef1bac9574b58fb38b3b429b392"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.598780 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.599182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n" 
event={"ID":"85dd770b-9d5c-4cc9-adaa-87963d5bb160","Type":"ContainerStarted","Data":"c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.599303 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-htk8n" Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.600132 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab","Type":"ContainerStarted","Data":"cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.600210 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.601571 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047404d9-b0ab-44e2-a31d-94d8fe429698","Type":"ContainerStarted","Data":"2f8d3e11442030783b8beb821adc64c961a2829fa98e12946ceb2502de4e83c9"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.602723 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerStarted","Data":"89c20ea2719d92d48088b58d7e39f9d562964dd99319780d3e2dff0b28232c4e"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.603708 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"244e58a1-ed2c-4ff6-8885-ebd066e8adab","Type":"ContainerStarted","Data":"15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.605233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f12892ce-6d68-4f79-b1dd-e874dffba145","Type":"ContainerStarted","Data":"d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce"} Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.619845 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.812676986 podStartE2EDuration="25.619823042s" podCreationTimestamp="2025-10-13 13:23:38 +0000 UTC" firstStartedPulling="2025-10-13 13:23:58.214786762 +0000 UTC m=+1015.748337028" lastFinishedPulling="2025-10-13 13:24:03.021932828 +0000 UTC m=+1020.555483084" observedRunningTime="2025-10-13 13:24:03.613325793 +0000 UTC m=+1021.146876059" watchObservedRunningTime="2025-10-13 13:24:03.619823042 +0000 UTC m=+1021.153373298" Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.639154 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-htk8n" podStartSLOduration=17.01779106 podStartE2EDuration="21.639114996s" podCreationTimestamp="2025-10-13 13:23:42 +0000 UTC" firstStartedPulling="2025-10-13 13:23:58.260706999 +0000 UTC m=+1015.794257255" lastFinishedPulling="2025-10-13 13:24:02.882030925 +0000 UTC m=+1020.415581191" observedRunningTime="2025-10-13 13:24:03.630621787 +0000 UTC m=+1021.164172053" watchObservedRunningTime="2025-10-13 13:24:03.639114996 +0000 UTC m=+1021.172665262" Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.665910 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.558093721 podStartE2EDuration="27.665890173s" podCreationTimestamp="2025-10-13 13:23:36 +0000 UTC" firstStartedPulling="2025-10-13 13:23:58.152986865 +0000 UTC m=+1015.686537121" lastFinishedPulling="2025-10-13 13:24:02.260783317 +0000 UTC m=+1019.794333573" observedRunningTime="2025-10-13 13:24:03.664733885 +0000 UTC m=+1021.198284151" watchObservedRunningTime="2025-10-13 13:24:03.665890173 +0000 UTC 
m=+1021.199440439" Oct 13 13:24:03 crc kubenswrapper[4797]: I1013 13:24:03.692693 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.129981179 podStartE2EDuration="28.69266914s" podCreationTimestamp="2025-10-13 13:23:35 +0000 UTC" firstStartedPulling="2025-10-13 13:23:58.15646817 +0000 UTC m=+1015.690018426" lastFinishedPulling="2025-10-13 13:23:58.719156131 +0000 UTC m=+1016.252706387" observedRunningTime="2025-10-13 13:24:03.68329566 +0000 UTC m=+1021.216845926" watchObservedRunningTime="2025-10-13 13:24:03.69266914 +0000 UTC m=+1021.226219396" Oct 13 13:24:04 crc kubenswrapper[4797]: I1013 13:24:04.613916 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerID="89c20ea2719d92d48088b58d7e39f9d562964dd99319780d3e2dff0b28232c4e" exitCode=0 Oct 13 13:24:04 crc kubenswrapper[4797]: I1013 13:24:04.613986 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerDied","Data":"89c20ea2719d92d48088b58d7e39f9d562964dd99319780d3e2dff0b28232c4e"} Oct 13 13:24:05 crc kubenswrapper[4797]: I1013 13:24:05.625692 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerStarted","Data":"a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e"} Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.153340 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-57bdg"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.154650 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.160586 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.179678 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-57bdg"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.237540 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-config\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.237586 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovn-rundir\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.237611 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-combined-ca-bundle\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.237724 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnpw\" (UniqueName: \"kubernetes.io/projected/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-kube-api-access-mwnpw\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " 
pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.237755 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovs-rundir\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.237791 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.282736 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f694bf-m8ztx"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.300921 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f67f5697-wwk78"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.302340 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.311841 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.327198 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f67f5697-wwk78"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.340691 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.340891 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8vg\" (UniqueName: \"kubernetes.io/projected/fc05ede6-0732-4eb7-8e84-803d40530c21-kube-api-access-hv8vg\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341001 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-dns-svc\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341102 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-config\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 
13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341194 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovn-rundir\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341274 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-config\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-combined-ca-bundle\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-ovsdbserver-sb\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341539 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnpw\" (UniqueName: \"kubernetes.io/projected/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-kube-api-access-mwnpw\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc 
kubenswrapper[4797]: I1013 13:24:06.341609 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovs-rundir\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341541 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovn-rundir\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341940 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovs-rundir\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.341968 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-config\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.346306 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-combined-ca-bundle\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.359781 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.366355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnpw\" (UniqueName: \"kubernetes.io/projected/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-kube-api-access-mwnpw\") pod \"ovn-controller-metrics-57bdg\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.414246 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7869c47d6c-g9669"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.442777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8vg\" (UniqueName: \"kubernetes.io/projected/fc05ede6-0732-4eb7-8e84-803d40530c21-kube-api-access-hv8vg\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.444208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-dns-svc\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.444301 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-config\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc 
kubenswrapper[4797]: I1013 13:24:06.444336 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-ovsdbserver-sb\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.447090 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-config\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.447551 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-ovsdbserver-sb\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.447707 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-dns-svc\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.462751 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8vg\" (UniqueName: \"kubernetes.io/projected/fc05ede6-0732-4eb7-8e84-803d40530c21-kube-api-access-hv8vg\") pod \"dnsmasq-dns-77f67f5697-wwk78\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.492592 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.496206 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d49c4d845-xmwk4"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.497432 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.502161 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.518963 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d49c4d845-xmwk4"] Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.545406 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-dns-svc\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.545704 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-config\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.545793 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-nb\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.545937 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rs8f\" (UniqueName: \"kubernetes.io/projected/e708d16b-703f-455a-a806-a9e96e23da95-kube-api-access-5rs8f\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.546017 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-sb\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.628837 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.647634 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rs8f\" (UniqueName: \"kubernetes.io/projected/e708d16b-703f-455a-a806-a9e96e23da95-kube-api-access-5rs8f\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.647701 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-sb\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.647744 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-dns-svc\") pod 
\"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.647857 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-config\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.647892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-nb\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.649012 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-sb\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.649067 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-nb\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.649111 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-dns-svc\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " 
pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.649607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-config\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.665177 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rs8f\" (UniqueName: \"kubernetes.io/projected/e708d16b-703f-455a-a806-a9e96e23da95-kube-api-access-5rs8f\") pod \"dnsmasq-dns-d49c4d845-xmwk4\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") " pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.733962 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.738319 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.813093 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.813157 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.821346 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.987353 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-config\") pod \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.987822 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrgw5\" (UniqueName: \"kubernetes.io/projected/0df186e7-541e-4998-bba3-95f086636a6d-kube-api-access-rrgw5\") pod \"0df186e7-541e-4998-bba3-95f086636a6d\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.987912 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-dns-svc\") pod \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.988078 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzd2r\" (UniqueName: \"kubernetes.io/projected/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-kube-api-access-wzd2r\") pod \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\" (UID: \"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4\") " Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.988139 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-config\") pod \"0df186e7-541e-4998-bba3-95f086636a6d\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.988189 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-dns-svc\") pod \"0df186e7-541e-4998-bba3-95f086636a6d\" (UID: \"0df186e7-541e-4998-bba3-95f086636a6d\") " Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.990259 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-config" (OuterVolumeSpecName: "config") pod "ed404a60-7b1e-4d3d-91a1-50a66c87f7b4" (UID: "ed404a60-7b1e-4d3d-91a1-50a66c87f7b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.992975 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed404a60-7b1e-4d3d-91a1-50a66c87f7b4" (UID: "ed404a60-7b1e-4d3d-91a1-50a66c87f7b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.993354 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-config" (OuterVolumeSpecName: "config") pod "0df186e7-541e-4998-bba3-95f086636a6d" (UID: "0df186e7-541e-4998-bba3-95f086636a6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:06 crc kubenswrapper[4797]: I1013 13:24:06.993698 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0df186e7-541e-4998-bba3-95f086636a6d" (UID: "0df186e7-541e-4998-bba3-95f086636a6d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.007989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-kube-api-access-wzd2r" (OuterVolumeSpecName: "kube-api-access-wzd2r") pod "ed404a60-7b1e-4d3d-91a1-50a66c87f7b4" (UID: "ed404a60-7b1e-4d3d-91a1-50a66c87f7b4"). InnerVolumeSpecName "kube-api-access-wzd2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.008745 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df186e7-541e-4998-bba3-95f086636a6d-kube-api-access-rrgw5" (OuterVolumeSpecName: "kube-api-access-rrgw5") pod "0df186e7-541e-4998-bba3-95f086636a6d" (UID: "0df186e7-541e-4998-bba3-95f086636a6d"). InnerVolumeSpecName "kube-api-access-rrgw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.090602 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.090635 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzd2r\" (UniqueName: \"kubernetes.io/projected/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-kube-api-access-wzd2r\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.090647 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.090661 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0df186e7-541e-4998-bba3-95f086636a6d-dns-svc\") on node 
\"crc\" DevicePath \"\"" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.090670 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.090679 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrgw5\" (UniqueName: \"kubernetes.io/projected/0df186e7-541e-4998-bba3-95f086636a6d-kube-api-access-rrgw5\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.292181 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f67f5697-wwk78"] Oct 13 13:24:07 crc kubenswrapper[4797]: W1013 13:24:07.299243 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc05ede6_0732_4eb7_8e84_803d40530c21.slice/crio-44349de50458b5b243a468ee4d4d325f4229cd591e8ef7df4d638cc155f1ffcd WatchSource:0}: Error finding container 44349de50458b5b243a468ee4d4d325f4229cd591e8ef7df4d638cc155f1ffcd: Status 404 returned error can't find the container with id 44349de50458b5b243a468ee4d4d325f4229cd591e8ef7df4d638cc155f1ffcd Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.319482 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-57bdg"] Oct 13 13:24:07 crc kubenswrapper[4797]: W1013 13:24:07.319562 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc3b8cc_c74c_402d_8284_7d578bfa7c02.slice/crio-57bb4040b4aab5e54aad325812c0ed957180a8bb668bec1728a3e319b457587d WatchSource:0}: Error finding container 57bb4040b4aab5e54aad325812c0ed957180a8bb668bec1728a3e319b457587d: Status 404 returned error can't find the container with id 57bb4040b4aab5e54aad325812c0ed957180a8bb668bec1728a3e319b457587d Oct 13 13:24:07 crc 
kubenswrapper[4797]: I1013 13:24:07.408703 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d49c4d845-xmwk4"] Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.643887 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"244e58a1-ed2c-4ff6-8885-ebd066e8adab","Type":"ContainerStarted","Data":"ea620bd698810f04fad3cc655e6b00829d4603731d7d1dd124d75e6de787a1e3"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.646111 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-57bdg" event={"ID":"1fc3b8cc-c74c-402d-8284-7d578bfa7c02","Type":"ContainerStarted","Data":"58cfde229cd18c95dec2460726a8982f10895baa52b242e5b8b923162d984a0a"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.646171 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-57bdg" event={"ID":"1fc3b8cc-c74c-402d-8284-7d578bfa7c02","Type":"ContainerStarted","Data":"57bb4040b4aab5e54aad325812c0ed957180a8bb668bec1728a3e319b457587d"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.647491 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" event={"ID":"0df186e7-541e-4998-bba3-95f086636a6d","Type":"ContainerDied","Data":"823aa6bafcf219db1ff2bc72d64b5ecc41c6e8cbc893969d92b63e1e4711d0d8"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.647504 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f694bf-m8ztx" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.648692 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" event={"ID":"e708d16b-703f-455a-a806-a9e96e23da95","Type":"ContainerStarted","Data":"c6fb38884b0225996b320460bf9b750805809eea67512f6e06b024e5aa4cb325"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.649706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" event={"ID":"ed404a60-7b1e-4d3d-91a1-50a66c87f7b4","Type":"ContainerDied","Data":"9bec38f6ba065f92070a6bb0b0e9217b2d2a843288e7b4ab90d27c1e11b94ad7"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.649868 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7869c47d6c-g9669" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.651084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" event={"ID":"fc05ede6-0732-4eb7-8e84-803d40530c21","Type":"ContainerStarted","Data":"44349de50458b5b243a468ee4d4d325f4229cd591e8ef7df4d638cc155f1ffcd"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.652995 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047404d9-b0ab-44e2-a31d-94d8fe429698","Type":"ContainerStarted","Data":"eb49a8ba0c15790a316319a7af2bb9f90a15b3fcad1167d493975eeda527d705"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.654964 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerStarted","Data":"530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368"} Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.655518 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 
13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.655594 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.671999 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.94303204 podStartE2EDuration="25.671980748s" podCreationTimestamp="2025-10-13 13:23:42 +0000 UTC" firstStartedPulling="2025-10-13 13:23:59.068543307 +0000 UTC m=+1016.602093563" lastFinishedPulling="2025-10-13 13:24:06.797492015 +0000 UTC m=+1024.331042271" observedRunningTime="2025-10-13 13:24:07.667647482 +0000 UTC m=+1025.201197788" watchObservedRunningTime="2025-10-13 13:24:07.671980748 +0000 UTC m=+1025.205531004" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.720391 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=15.25324276 podStartE2EDuration="23.720363576s" podCreationTimestamp="2025-10-13 13:23:44 +0000 UTC" firstStartedPulling="2025-10-13 13:23:58.329959569 +0000 UTC m=+1015.863509825" lastFinishedPulling="2025-10-13 13:24:06.797080385 +0000 UTC m=+1024.330630641" observedRunningTime="2025-10-13 13:24:07.717338002 +0000 UTC m=+1025.250888278" watchObservedRunningTime="2025-10-13 13:24:07.720363576 +0000 UTC m=+1025.253913852" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.721090 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-57bdg" podStartSLOduration=1.721077333 podStartE2EDuration="1.721077333s" podCreationTimestamp="2025-10-13 13:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:24:07.68996167 +0000 UTC m=+1025.223511956" watchObservedRunningTime="2025-10-13 13:24:07.721077333 +0000 UTC m=+1025.254627629" Oct 13 13:24:07 crc 
kubenswrapper[4797]: I1013 13:24:07.775660 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7869c47d6c-g9669"] Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.811578 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7869c47d6c-g9669"] Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.817326 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2mpq9" podStartSLOduration=22.340273155 podStartE2EDuration="25.817304195s" podCreationTimestamp="2025-10-13 13:23:42 +0000 UTC" firstStartedPulling="2025-10-13 13:23:59.405043956 +0000 UTC m=+1016.938594212" lastFinishedPulling="2025-10-13 13:24:02.882074996 +0000 UTC m=+1020.415625252" observedRunningTime="2025-10-13 13:24:07.797413077 +0000 UTC m=+1025.330963363" watchObservedRunningTime="2025-10-13 13:24:07.817304195 +0000 UTC m=+1025.350854451" Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.887119 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f694bf-m8ztx"] Oct 13 13:24:07 crc kubenswrapper[4797]: I1013 13:24:07.892019 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86f694bf-m8ztx"] Oct 13 13:24:08 crc kubenswrapper[4797]: E1013 13:24:08.054783 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc05ede6_0732_4eb7_8e84_803d40530c21.slice/crio-4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742.scope\": RecentStats: unable to find data in memory cache]" Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.356601 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.544663 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.662718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"300309d9-4375-4cce-8fb1-0833d2cfdcde","Type":"ContainerStarted","Data":"b1e44153ed376aa2a5b20e44939ad60b37e029b63d505942d897d9c4bae17008"} Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.665559 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"21067728-d3cf-4ff2-94c9-87600f7324ab","Type":"ContainerStarted","Data":"dc60509f6fdf80ab3c7a93d4d24dd520df75f4c9ccb98b44fd3e2e5450ca0b88"} Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.667256 4797 generic.go:334] "Generic (PLEG): container finished" podID="e708d16b-703f-455a-a806-a9e96e23da95" containerID="ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd" exitCode=0 Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.667392 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" event={"ID":"e708d16b-703f-455a-a806-a9e96e23da95","Type":"ContainerDied","Data":"ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd"} Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.669507 4797 generic.go:334] "Generic (PLEG): container finished" podID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerID="4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742" exitCode=0 Oct 13 13:24:08 crc kubenswrapper[4797]: I1013 13:24:08.670230 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" event={"ID":"fc05ede6-0732-4eb7-8e84-803d40530c21","Type":"ContainerDied","Data":"4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742"} Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.250663 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df186e7-541e-4998-bba3-95f086636a6d" 
path="/var/lib/kubelet/pods/0df186e7-541e-4998-bba3-95f086636a6d/volumes" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.251494 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed404a60-7b1e-4d3d-91a1-50a66c87f7b4" path="/var/lib/kubelet/pods/ed404a60-7b1e-4d3d-91a1-50a66c87f7b4/volumes" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.682585 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" event={"ID":"e708d16b-703f-455a-a806-a9e96e23da95","Type":"ContainerStarted","Data":"d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e"} Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.682645 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.686611 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" event={"ID":"fc05ede6-0732-4eb7-8e84-803d40530c21","Type":"ContainerStarted","Data":"a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831"} Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.686648 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.702579 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" podStartSLOduration=3.105035782 podStartE2EDuration="3.702556767s" podCreationTimestamp="2025-10-13 13:24:06 +0000 UTC" firstStartedPulling="2025-10-13 13:24:07.410669015 +0000 UTC m=+1024.944219271" lastFinishedPulling="2025-10-13 13:24:08.00819001 +0000 UTC m=+1025.541740256" observedRunningTime="2025-10-13 13:24:09.700124977 +0000 UTC m=+1027.233675253" watchObservedRunningTime="2025-10-13 13:24:09.702556767 +0000 UTC m=+1027.236107063" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.720473 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" podStartSLOduration=3.268157585 podStartE2EDuration="3.720452136s" podCreationTimestamp="2025-10-13 13:24:06 +0000 UTC" firstStartedPulling="2025-10-13 13:24:07.301209498 +0000 UTC m=+1024.834759754" lastFinishedPulling="2025-10-13 13:24:07.753504049 +0000 UTC m=+1025.287054305" observedRunningTime="2025-10-13 13:24:09.715193247 +0000 UTC m=+1027.248743543" watchObservedRunningTime="2025-10-13 13:24:09.720452136 +0000 UTC m=+1027.254002432" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.924563 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 13 13:24:09 crc kubenswrapper[4797]: I1013 13:24:09.986303 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 13 13:24:10 crc kubenswrapper[4797]: I1013 13:24:10.356339 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 13 13:24:10 crc kubenswrapper[4797]: I1013 13:24:10.410713 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 13 13:24:10 crc kubenswrapper[4797]: I1013 13:24:10.694194 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 13 13:24:10 crc kubenswrapper[4797]: I1013 13:24:10.736043 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 13 13:24:10 crc kubenswrapper[4797]: I1013 13:24:10.744972 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 13 13:24:10 crc kubenswrapper[4797]: I1013 13:24:10.982222 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.037456 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.213732 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.215273 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.219821 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.220078 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lxpcn" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.219975 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.220020 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.227188 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.273693 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.274118 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-scripts\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" 
Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.274156 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwsx\" (UniqueName: \"kubernetes.io/projected/d661f302-5234-4d18-9aa8-0eddd26153fe-kube-api-access-9fwsx\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.274185 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.274259 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.274305 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-config\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.274332 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.375888 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9fwsx\" (UniqueName: \"kubernetes.io/projected/d661f302-5234-4d18-9aa8-0eddd26153fe-kube-api-access-9fwsx\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.376165 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.376323 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.376487 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-config\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.376598 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.376722 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.376857 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-scripts\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.377457 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.377538 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-config\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.377642 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-scripts\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.381937 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.382243 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.383728 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.397415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwsx\" (UniqueName: \"kubernetes.io/projected/d661f302-5234-4d18-9aa8-0eddd26153fe-kube-api-access-9fwsx\") pod \"ovn-northd-0\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.538510 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.716666 4797 generic.go:334] "Generic (PLEG): container finished" podID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerID="b1e44153ed376aa2a5b20e44939ad60b37e029b63d505942d897d9c4bae17008" exitCode=0 Oct 13 13:24:11 crc kubenswrapper[4797]: I1013 13:24:11.716748 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"300309d9-4375-4cce-8fb1-0833d2cfdcde","Type":"ContainerDied","Data":"b1e44153ed376aa2a5b20e44939ad60b37e029b63d505942d897d9c4bae17008"} Oct 13 13:24:12 crc kubenswrapper[4797]: I1013 13:24:12.028342 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 13:24:12 crc kubenswrapper[4797]: W1013 13:24:12.033020 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd661f302_5234_4d18_9aa8_0eddd26153fe.slice/crio-c43d7fd65f1a3539ba1c9fb279f0e0bca70eec6d5392156f528778ea76540ad0 WatchSource:0}: Error finding container c43d7fd65f1a3539ba1c9fb279f0e0bca70eec6d5392156f528778ea76540ad0: Status 404 returned error can't find the container with id c43d7fd65f1a3539ba1c9fb279f0e0bca70eec6d5392156f528778ea76540ad0 Oct 13 13:24:12 crc kubenswrapper[4797]: I1013 13:24:12.137153 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 13 13:24:12 crc kubenswrapper[4797]: I1013 13:24:12.724948 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"300309d9-4375-4cce-8fb1-0833d2cfdcde","Type":"ContainerStarted","Data":"a0ba3520e5651533522d5be4eedd2ce11b85f4a41e04d516a04f5658408ca62b"} Oct 13 13:24:12 crc kubenswrapper[4797]: I1013 13:24:12.726524 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"d661f302-5234-4d18-9aa8-0eddd26153fe","Type":"ContainerStarted","Data":"c43d7fd65f1a3539ba1c9fb279f0e0bca70eec6d5392156f528778ea76540ad0"} Oct 13 13:24:13 crc kubenswrapper[4797]: I1013 13:24:13.257350 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371997.597439 podStartE2EDuration="39.257337035s" podCreationTimestamp="2025-10-13 13:23:34 +0000 UTC" firstStartedPulling="2025-10-13 13:23:36.107265605 +0000 UTC m=+993.640815871" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:24:12.74376821 +0000 UTC m=+1030.277318496" watchObservedRunningTime="2025-10-13 13:24:13.257337035 +0000 UTC m=+1030.790887281" Oct 13 13:24:13 crc kubenswrapper[4797]: I1013 13:24:13.735232 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d661f302-5234-4d18-9aa8-0eddd26153fe","Type":"ContainerStarted","Data":"1a733e45e064aeec3878d4c5dd8fe67bcb4e25caaf9192479e03c78ce5fbd2b5"} Oct 13 13:24:13 crc kubenswrapper[4797]: I1013 13:24:13.735284 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d661f302-5234-4d18-9aa8-0eddd26153fe","Type":"ContainerStarted","Data":"c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169"} Oct 13 13:24:13 crc kubenswrapper[4797]: I1013 13:24:13.735369 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 13 13:24:13 crc kubenswrapper[4797]: I1013 13:24:13.754418 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.556075874 podStartE2EDuration="2.754399895s" podCreationTimestamp="2025-10-13 13:24:11 +0000 UTC" firstStartedPulling="2025-10-13 13:24:12.034716527 +0000 UTC m=+1029.568266783" lastFinishedPulling="2025-10-13 13:24:13.233040548 +0000 UTC m=+1030.766590804" observedRunningTime="2025-10-13 
13:24:13.753611535 +0000 UTC m=+1031.287161811" watchObservedRunningTime="2025-10-13 13:24:13.754399895 +0000 UTC m=+1031.287950171" Oct 13 13:24:15 crc kubenswrapper[4797]: I1013 13:24:15.461694 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 13 13:24:15 crc kubenswrapper[4797]: I1013 13:24:15.462033 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 13 13:24:16 crc kubenswrapper[4797]: I1013 13:24:16.631118 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:16 crc kubenswrapper[4797]: I1013 13:24:16.824763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" Oct 13 13:24:16 crc kubenswrapper[4797]: I1013 13:24:16.870677 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f67f5697-wwk78"] Oct 13 13:24:16 crc kubenswrapper[4797]: I1013 13:24:16.870932 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerName="dnsmasq-dns" containerID="cri-o://a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831" gracePeriod=10 Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.288156 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.400649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv8vg\" (UniqueName: \"kubernetes.io/projected/fc05ede6-0732-4eb7-8e84-803d40530c21-kube-api-access-hv8vg\") pod \"fc05ede6-0732-4eb7-8e84-803d40530c21\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.400701 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-config\") pod \"fc05ede6-0732-4eb7-8e84-803d40530c21\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.400752 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-dns-svc\") pod \"fc05ede6-0732-4eb7-8e84-803d40530c21\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.400997 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-ovsdbserver-sb\") pod \"fc05ede6-0732-4eb7-8e84-803d40530c21\" (UID: \"fc05ede6-0732-4eb7-8e84-803d40530c21\") " Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.407045 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc05ede6-0732-4eb7-8e84-803d40530c21-kube-api-access-hv8vg" (OuterVolumeSpecName: "kube-api-access-hv8vg") pod "fc05ede6-0732-4eb7-8e84-803d40530c21" (UID: "fc05ede6-0732-4eb7-8e84-803d40530c21"). InnerVolumeSpecName "kube-api-access-hv8vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.440270 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc05ede6-0732-4eb7-8e84-803d40530c21" (UID: "fc05ede6-0732-4eb7-8e84-803d40530c21"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.443077 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc05ede6-0732-4eb7-8e84-803d40530c21" (UID: "fc05ede6-0732-4eb7-8e84-803d40530c21"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.471099 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-config" (OuterVolumeSpecName: "config") pod "fc05ede6-0732-4eb7-8e84-803d40530c21" (UID: "fc05ede6-0732-4eb7-8e84-803d40530c21"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.502617 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.502655 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv8vg\" (UniqueName: \"kubernetes.io/projected/fc05ede6-0732-4eb7-8e84-803d40530c21-kube-api-access-hv8vg\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.502667 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.502675 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc05ede6-0732-4eb7-8e84-803d40530c21-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.534365 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.587970 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.774479 4797 generic.go:334] "Generic (PLEG): container finished" podID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerID="a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831" exitCode=0 Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.774576 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.774614 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" event={"ID":"fc05ede6-0732-4eb7-8e84-803d40530c21","Type":"ContainerDied","Data":"a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831"} Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.774706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f67f5697-wwk78" event={"ID":"fc05ede6-0732-4eb7-8e84-803d40530c21","Type":"ContainerDied","Data":"44349de50458b5b243a468ee4d4d325f4229cd591e8ef7df4d638cc155f1ffcd"} Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.774752 4797 scope.go:117] "RemoveContainer" containerID="a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.797133 4797 scope.go:117] "RemoveContainer" containerID="4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.805530 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f67f5697-wwk78"] Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.811984 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f67f5697-wwk78"] Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.833377 4797 scope.go:117] "RemoveContainer" containerID="a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831" Oct 13 13:24:17 crc kubenswrapper[4797]: E1013 13:24:17.833874 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831\": container with ID starting with a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831 not found: ID does not exist" 
containerID="a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.833914 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831"} err="failed to get container status \"a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831\": rpc error: code = NotFound desc = could not find container \"a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831\": container with ID starting with a5c64fa2f39ca1ad6761835f21af55ff63c3d8fa8a81c851bc954a4f8bda2831 not found: ID does not exist" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.833942 4797 scope.go:117] "RemoveContainer" containerID="4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742" Oct 13 13:24:17 crc kubenswrapper[4797]: E1013 13:24:17.834211 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742\": container with ID starting with 4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742 not found: ID does not exist" containerID="4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742" Oct 13 13:24:17 crc kubenswrapper[4797]: I1013 13:24:17.834249 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742"} err="failed to get container status \"4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742\": rpc error: code = NotFound desc = could not find container \"4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742\": container with ID starting with 4ecb8a9b78da5ed96baecd47784a2e41db7470fd4b651cc6d7f874429198b742 not found: ID does not exist" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.499402 4797 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c757dd68f-28ns7"] Oct 13 13:24:18 crc kubenswrapper[4797]: E1013 13:24:18.499788 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerName="init" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.505195 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerName="init" Oct 13 13:24:18 crc kubenswrapper[4797]: E1013 13:24:18.505293 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerName="dnsmasq-dns" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.505307 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerName="dnsmasq-dns" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.505649 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" containerName="dnsmasq-dns" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.506827 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.525273 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c757dd68f-28ns7"] Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.635215 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-dns-svc\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.635298 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-sb\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.635340 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blm8c\" (UniqueName: \"kubernetes.io/projected/b0488671-d7b4-4c33-a64b-b163a812f2eb-kube-api-access-blm8c\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.635387 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-nb\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.635403 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-config\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.736485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-dns-svc\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.736563 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-sb\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.736595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blm8c\" (UniqueName: \"kubernetes.io/projected/b0488671-d7b4-4c33-a64b-b163a812f2eb-kube-api-access-blm8c\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.736627 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-nb\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.736644 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-config\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.737839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-config\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.737856 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-sb\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.737973 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-dns-svc\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.738026 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-nb\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.753652 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blm8c\" (UniqueName: \"kubernetes.io/projected/b0488671-d7b4-4c33-a64b-b163a812f2eb-kube-api-access-blm8c\") pod \"dnsmasq-dns-c757dd68f-28ns7\" (UID: 
\"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:18 crc kubenswrapper[4797]: I1013 13:24:18.838471 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.247775 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc05ede6-0732-4eb7-8e84-803d40530c21" path="/var/lib/kubelet/pods/fc05ede6-0732-4eb7-8e84-803d40530c21/volumes" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.254057 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c757dd68f-28ns7"] Oct 13 13:24:19 crc kubenswrapper[4797]: W1013 13:24:19.261320 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0488671_d7b4_4c33_a64b_b163a812f2eb.slice/crio-c23a409ccdf4ecbc45c1b97b1fb1448926b8a288d1eeb026c58bb66f0903ea5e WatchSource:0}: Error finding container c23a409ccdf4ecbc45c1b97b1fb1448926b8a288d1eeb026c58bb66f0903ea5e: Status 404 returned error can't find the container with id c23a409ccdf4ecbc45c1b97b1fb1448926b8a288d1eeb026c58bb66f0903ea5e Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.637916 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.644504 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.646795 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5fgdb" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.647387 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.647416 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.647564 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.666636 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.754291 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-cache\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.754610 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sckhd\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-kube-api-access-sckhd\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.754645 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-lock\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 
13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.754671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.754699 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.793210 4797 generic.go:334] "Generic (PLEG): container finished" podID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerID="bae487224769ef9bfd1228980c3701a1b9e337f37dd3010452a6d49a59eff1c3" exitCode=0 Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.793261 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" event={"ID":"b0488671-d7b4-4c33-a64b-b163a812f2eb","Type":"ContainerDied","Data":"bae487224769ef9bfd1228980c3701a1b9e337f37dd3010452a6d49a59eff1c3"} Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.793289 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" event={"ID":"b0488671-d7b4-4c33-a64b-b163a812f2eb","Type":"ContainerStarted","Data":"c23a409ccdf4ecbc45c1b97b1fb1448926b8a288d1eeb026c58bb66f0903ea5e"} Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.856150 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-lock\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 
13:24:19.856213 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.856257 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.856408 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-cache\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.856429 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sckhd\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-kube-api-access-sckhd\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.857141 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.857306 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-lock\") pod \"swift-storage-0\" 
(UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.857779 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-cache\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: E1013 13:24:19.857867 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 13 13:24:19 crc kubenswrapper[4797]: E1013 13:24:19.857884 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 13 13:24:19 crc kubenswrapper[4797]: E1013 13:24:19.857947 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift podName:f853cd93-92bc-46d6-8bd4-82373edcac6c nodeName:}" failed. No retries permitted until 2025-10-13 13:24:20.357928669 +0000 UTC m=+1037.891478925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift") pod "swift-storage-0" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c") : configmap "swift-ring-files" not found Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.875566 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sckhd\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-kube-api-access-sckhd\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:19 crc kubenswrapper[4797]: I1013 13:24:19.876728 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.175550 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pn9q8"] Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.177278 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.180545 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.180545 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.180998 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.188213 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pn9q8"] Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.262908 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-swiftconf\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.263006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-scripts\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.263176 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-ring-data-devices\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.263202 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9075d4a4-54e2-492f-bc84-bd1fb11df325-etc-swift\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.263233 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-combined-ca-bundle\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.263265 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-dispersionconf\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.263289 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xws7b\" (UniqueName: \"kubernetes.io/projected/9075d4a4-54e2-492f-bc84-bd1fb11df325-kube-api-access-xws7b\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.364933 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-swiftconf\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365227 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-scripts\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365381 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365472 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-ring-data-devices\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365594 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9075d4a4-54e2-492f-bc84-bd1fb11df325-etc-swift\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365690 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-combined-ca-bundle\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365819 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-dispersionconf\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.365919 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xws7b\" (UniqueName: \"kubernetes.io/projected/9075d4a4-54e2-492f-bc84-bd1fb11df325-kube-api-access-xws7b\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: E1013 13:24:20.365647 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 13 13:24:20 crc kubenswrapper[4797]: E1013 13:24:20.366142 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 13 13:24:20 crc kubenswrapper[4797]: E1013 13:24:20.366249 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift podName:f853cd93-92bc-46d6-8bd4-82373edcac6c nodeName:}" failed. No retries permitted until 2025-10-13 13:24:21.366232795 +0000 UTC m=+1038.899783061 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift") pod "swift-storage-0" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c") : configmap "swift-ring-files" not found
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.366178 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9075d4a4-54e2-492f-bc84-bd1fb11df325-etc-swift\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.366409 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-scripts\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.366619 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-ring-data-devices\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.369527 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-dispersionconf\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.369914 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-swiftconf\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.381365 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-combined-ca-bundle\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.384043 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xws7b\" (UniqueName: \"kubernetes.io/projected/9075d4a4-54e2-492f-bc84-bd1fb11df325-kube-api-access-xws7b\") pod \"swift-ring-rebalance-pn9q8\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.508665 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pn9q8"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.815226 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" event={"ID":"b0488671-d7b4-4c33-a64b-b163a812f2eb","Type":"ContainerStarted","Data":"6c7f3f5a65043f5417a69014fd274093dd20803d37acd9e49b2e231982ecaa1a"}
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.815876 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c757dd68f-28ns7"
Oct 13 13:24:20 crc kubenswrapper[4797]: I1013 13:24:20.842559 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" podStartSLOduration=2.842527605 podStartE2EDuration="2.842527605s" podCreationTimestamp="2025-10-13 13:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:24:20.841161772 +0000 UTC m=+1038.374712028" watchObservedRunningTime="2025-10-13 13:24:20.842527605 +0000 UTC m=+1038.376077861"
Oct 13 13:24:21 crc kubenswrapper[4797]: I1013 13:24:21.035768 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pn9q8"]
Oct 13 13:24:21 crc kubenswrapper[4797]: W1013 13:24:21.048129 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9075d4a4_54e2_492f_bc84_bd1fb11df325.slice/crio-5ae409a2b24ba8566cae12870c746c63f0239f6f5ba7c1378070a20c63007d7e WatchSource:0}: Error finding container 5ae409a2b24ba8566cae12870c746c63f0239f6f5ba7c1378070a20c63007d7e: Status 404 returned error can't find the container with id 5ae409a2b24ba8566cae12870c746c63f0239f6f5ba7c1378070a20c63007d7e
Oct 13 13:24:21 crc kubenswrapper[4797]: I1013 13:24:21.386402 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0"
Oct 13 13:24:21 crc kubenswrapper[4797]: E1013 13:24:21.386621 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 13 13:24:21 crc kubenswrapper[4797]: E1013 13:24:21.386650 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 13 13:24:21 crc kubenswrapper[4797]: E1013 13:24:21.386739 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift podName:f853cd93-92bc-46d6-8bd4-82373edcac6c nodeName:}" failed. No retries permitted until 2025-10-13 13:24:23.386716722 +0000 UTC m=+1040.920266988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift") pod "swift-storage-0" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c") : configmap "swift-ring-files" not found
Oct 13 13:24:21 crc kubenswrapper[4797]: I1013 13:24:21.822856 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pn9q8" event={"ID":"9075d4a4-54e2-492f-bc84-bd1fb11df325","Type":"ContainerStarted","Data":"5ae409a2b24ba8566cae12870c746c63f0239f6f5ba7c1378070a20c63007d7e"}
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.383951 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-25b92"]
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.385799 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-25b92"
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.389451 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-25b92"]
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.504983 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpphk\" (UniqueName: \"kubernetes.io/projected/d77b871f-f807-4a5d-a6ac-ee918d9d1530-kube-api-access-gpphk\") pod \"glance-db-create-25b92\" (UID: \"d77b871f-f807-4a5d-a6ac-ee918d9d1530\") " pod="openstack/glance-db-create-25b92"
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.606688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpphk\" (UniqueName: \"kubernetes.io/projected/d77b871f-f807-4a5d-a6ac-ee918d9d1530-kube-api-access-gpphk\") pod \"glance-db-create-25b92\" (UID: \"d77b871f-f807-4a5d-a6ac-ee918d9d1530\") " pod="openstack/glance-db-create-25b92"
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.629385 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpphk\" (UniqueName: \"kubernetes.io/projected/d77b871f-f807-4a5d-a6ac-ee918d9d1530-kube-api-access-gpphk\") pod \"glance-db-create-25b92\" (UID: \"d77b871f-f807-4a5d-a6ac-ee918d9d1530\") " pod="openstack/glance-db-create-25b92"
Oct 13 13:24:22 crc kubenswrapper[4797]: I1013 13:24:22.718340 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-25b92"
Oct 13 13:24:23 crc kubenswrapper[4797]: I1013 13:24:23.153899 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-25b92"]
Oct 13 13:24:23 crc kubenswrapper[4797]: I1013 13:24:23.420848 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0"
Oct 13 13:24:23 crc kubenswrapper[4797]: E1013 13:24:23.421038 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 13 13:24:23 crc kubenswrapper[4797]: E1013 13:24:23.421064 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 13 13:24:23 crc kubenswrapper[4797]: E1013 13:24:23.421126 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift podName:f853cd93-92bc-46d6-8bd4-82373edcac6c nodeName:}" failed. No retries permitted until 2025-10-13 13:24:27.421105944 +0000 UTC m=+1044.954656200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift") pod "swift-storage-0" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c") : configmap "swift-ring-files" not found
Oct 13 13:24:24 crc kubenswrapper[4797]: I1013 13:24:24.845329 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-25b92" event={"ID":"d77b871f-f807-4a5d-a6ac-ee918d9d1530","Type":"ContainerStarted","Data":"215735d74a420c0891a993afd15a2a4b1cfd70ef7726270a79db144ea103e0b7"}
Oct 13 13:24:25 crc kubenswrapper[4797]: I1013 13:24:25.854238 4797 generic.go:334] "Generic (PLEG): container finished" podID="d77b871f-f807-4a5d-a6ac-ee918d9d1530" containerID="de93f004ff46c218cae69e7fffa4767fbadceb4b17be2daf089d6bb40a62869c" exitCode=0
Oct 13 13:24:25 crc kubenswrapper[4797]: I1013 13:24:25.854624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-25b92" event={"ID":"d77b871f-f807-4a5d-a6ac-ee918d9d1530","Type":"ContainerDied","Data":"de93f004ff46c218cae69e7fffa4767fbadceb4b17be2daf089d6bb40a62869c"}
Oct 13 13:24:25 crc kubenswrapper[4797]: I1013 13:24:25.857572 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pn9q8" event={"ID":"9075d4a4-54e2-492f-bc84-bd1fb11df325","Type":"ContainerStarted","Data":"5c5cc2d87050a3de571b1f1b9c6bf168c449bd32a7883ebd8e094f5734bd54f2"}
Oct 13 13:24:25 crc kubenswrapper[4797]: I1013 13:24:25.887506 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pn9q8" podStartSLOduration=2.233915206 podStartE2EDuration="5.887480939s" podCreationTimestamp="2025-10-13 13:24:20 +0000 UTC" firstStartedPulling="2025-10-13 13:24:21.050752696 +0000 UTC m=+1038.584302962" lastFinishedPulling="2025-10-13 13:24:24.704318439 +0000 UTC m=+1042.237868695" observedRunningTime="2025-10-13 13:24:25.886456424 +0000 UTC m=+1043.420006740" watchObservedRunningTime="2025-10-13 13:24:25.887480939 +0000 UTC m=+1043.421031215"
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.605161 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.714921 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-g4xv6"]
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.716077 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.726269 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g4xv6"]
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.800513 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8fd\" (UniqueName: \"kubernetes.io/projected/768bafb1-558b-4033-a28c-a504a8f75281-kube-api-access-7n8fd\") pod \"keystone-db-create-g4xv6\" (UID: \"768bafb1-558b-4033-a28c-a504a8f75281\") " pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.901909 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8fd\" (UniqueName: \"kubernetes.io/projected/768bafb1-558b-4033-a28c-a504a8f75281-kube-api-access-7n8fd\") pod \"keystone-db-create-g4xv6\" (UID: \"768bafb1-558b-4033-a28c-a504a8f75281\") " pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:26 crc kubenswrapper[4797]: I1013 13:24:26.922065 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8fd\" (UniqueName: \"kubernetes.io/projected/768bafb1-558b-4033-a28c-a504a8f75281-kube-api-access-7n8fd\") pod \"keystone-db-create-g4xv6\" (UID: \"768bafb1-558b-4033-a28c-a504a8f75281\") " pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.053719 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.069862 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6xtp5"]
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.070890 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.076662 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6xtp5"]
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.211987 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qjt\" (UniqueName: \"kubernetes.io/projected/830fc84d-bd05-42f9-a2c8-9404e9a1acb7-kube-api-access-q6qjt\") pod \"placement-db-create-6xtp5\" (UID: \"830fc84d-bd05-42f9-a2c8-9404e9a1acb7\") " pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.227276 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-25b92"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.312996 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpphk\" (UniqueName: \"kubernetes.io/projected/d77b871f-f807-4a5d-a6ac-ee918d9d1530-kube-api-access-gpphk\") pod \"d77b871f-f807-4a5d-a6ac-ee918d9d1530\" (UID: \"d77b871f-f807-4a5d-a6ac-ee918d9d1530\") "
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.313448 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qjt\" (UniqueName: \"kubernetes.io/projected/830fc84d-bd05-42f9-a2c8-9404e9a1acb7-kube-api-access-q6qjt\") pod \"placement-db-create-6xtp5\" (UID: \"830fc84d-bd05-42f9-a2c8-9404e9a1acb7\") " pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.318799 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77b871f-f807-4a5d-a6ac-ee918d9d1530-kube-api-access-gpphk" (OuterVolumeSpecName: "kube-api-access-gpphk") pod "d77b871f-f807-4a5d-a6ac-ee918d9d1530" (UID: "d77b871f-f807-4a5d-a6ac-ee918d9d1530"). InnerVolumeSpecName "kube-api-access-gpphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.331471 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qjt\" (UniqueName: \"kubernetes.io/projected/830fc84d-bd05-42f9-a2c8-9404e9a1acb7-kube-api-access-q6qjt\") pod \"placement-db-create-6xtp5\" (UID: \"830fc84d-bd05-42f9-a2c8-9404e9a1acb7\") " pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.415219 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpphk\" (UniqueName: \"kubernetes.io/projected/d77b871f-f807-4a5d-a6ac-ee918d9d1530-kube-api-access-gpphk\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.500403 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g4xv6"]
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.516756 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0"
Oct 13 13:24:27 crc kubenswrapper[4797]: E1013 13:24:27.516922 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 13 13:24:27 crc kubenswrapper[4797]: E1013 13:24:27.516937 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 13 13:24:27 crc kubenswrapper[4797]: E1013 13:24:27.516979 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift podName:f853cd93-92bc-46d6-8bd4-82373edcac6c nodeName:}" failed. No retries permitted until 2025-10-13 13:24:35.516965532 +0000 UTC m=+1053.050515788 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift") pod "swift-storage-0" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c") : configmap "swift-ring-files" not found
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.520434 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.873416 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-25b92"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.873410 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-25b92" event={"ID":"d77b871f-f807-4a5d-a6ac-ee918d9d1530","Type":"ContainerDied","Data":"215735d74a420c0891a993afd15a2a4b1cfd70ef7726270a79db144ea103e0b7"}
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.873820 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="215735d74a420c0891a993afd15a2a4b1cfd70ef7726270a79db144ea103e0b7"
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.879510 4797 generic.go:334] "Generic (PLEG): container finished" podID="768bafb1-558b-4033-a28c-a504a8f75281" containerID="8e8a08b1fb90c5143dfc8554374d133c2d0f35ccf8dc5db129321ce754983254" exitCode=0
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.879570 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g4xv6" event={"ID":"768bafb1-558b-4033-a28c-a504a8f75281","Type":"ContainerDied","Data":"8e8a08b1fb90c5143dfc8554374d133c2d0f35ccf8dc5db129321ce754983254"}
Oct 13 13:24:27 crc kubenswrapper[4797]: I1013 13:24:27.879595 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g4xv6" event={"ID":"768bafb1-558b-4033-a28c-a504a8f75281","Type":"ContainerStarted","Data":"5baac3c42b844c829c24116d8b212cd9bb2513c4662636f76c301e8b784422ad"}
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.016542 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6xtp5"]
Oct 13 13:24:28 crc kubenswrapper[4797]: W1013 13:24:28.019949 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod830fc84d_bd05_42f9_a2c8_9404e9a1acb7.slice/crio-0a4808a0752db027986c41c0d6ac58063967b9ddd08abb8344063a716f9a8581 WatchSource:0}: Error finding container 0a4808a0752db027986c41c0d6ac58063967b9ddd08abb8344063a716f9a8581: Status 404 returned error can't find the container with id 0a4808a0752db027986c41c0d6ac58063967b9ddd08abb8344063a716f9a8581
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.840084 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c757dd68f-28ns7"
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.888832 4797 generic.go:334] "Generic (PLEG): container finished" podID="830fc84d-bd05-42f9-a2c8-9404e9a1acb7" containerID="2d5f6e18830224ba06878209c725fc3b37c7a8b004f6560b8c4a0fc12883cc60" exitCode=0
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.889338 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6xtp5" event={"ID":"830fc84d-bd05-42f9-a2c8-9404e9a1acb7","Type":"ContainerDied","Data":"2d5f6e18830224ba06878209c725fc3b37c7a8b004f6560b8c4a0fc12883cc60"}
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.889376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6xtp5" event={"ID":"830fc84d-bd05-42f9-a2c8-9404e9a1acb7","Type":"ContainerStarted","Data":"0a4808a0752db027986c41c0d6ac58063967b9ddd08abb8344063a716f9a8581"}
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.898233 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d49c4d845-xmwk4"]
Oct 13 13:24:28 crc kubenswrapper[4797]: I1013 13:24:28.898737 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" podUID="e708d16b-703f-455a-a806-a9e96e23da95" containerName="dnsmasq-dns" containerID="cri-o://d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e" gracePeriod=10
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.334077 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.388021 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.455067 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-config\") pod \"e708d16b-703f-455a-a806-a9e96e23da95\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") "
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.455189 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8fd\" (UniqueName: \"kubernetes.io/projected/768bafb1-558b-4033-a28c-a504a8f75281-kube-api-access-7n8fd\") pod \"768bafb1-558b-4033-a28c-a504a8f75281\" (UID: \"768bafb1-558b-4033-a28c-a504a8f75281\") "
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.455212 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rs8f\" (UniqueName: \"kubernetes.io/projected/e708d16b-703f-455a-a806-a9e96e23da95-kube-api-access-5rs8f\") pod \"e708d16b-703f-455a-a806-a9e96e23da95\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") "
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.455235 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-dns-svc\") pod \"e708d16b-703f-455a-a806-a9e96e23da95\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") "
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.455272 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-sb\") pod \"e708d16b-703f-455a-a806-a9e96e23da95\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") "
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.455341 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-nb\") pod \"e708d16b-703f-455a-a806-a9e96e23da95\" (UID: \"e708d16b-703f-455a-a806-a9e96e23da95\") "
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.461357 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e708d16b-703f-455a-a806-a9e96e23da95-kube-api-access-5rs8f" (OuterVolumeSpecName: "kube-api-access-5rs8f") pod "e708d16b-703f-455a-a806-a9e96e23da95" (UID: "e708d16b-703f-455a-a806-a9e96e23da95"). InnerVolumeSpecName "kube-api-access-5rs8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.462029 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768bafb1-558b-4033-a28c-a504a8f75281-kube-api-access-7n8fd" (OuterVolumeSpecName: "kube-api-access-7n8fd") pod "768bafb1-558b-4033-a28c-a504a8f75281" (UID: "768bafb1-558b-4033-a28c-a504a8f75281"). InnerVolumeSpecName "kube-api-access-7n8fd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.494286 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-config" (OuterVolumeSpecName: "config") pod "e708d16b-703f-455a-a806-a9e96e23da95" (UID: "e708d16b-703f-455a-a806-a9e96e23da95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.494945 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e708d16b-703f-455a-a806-a9e96e23da95" (UID: "e708d16b-703f-455a-a806-a9e96e23da95"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.501756 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e708d16b-703f-455a-a806-a9e96e23da95" (UID: "e708d16b-703f-455a-a806-a9e96e23da95"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.530288 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e708d16b-703f-455a-a806-a9e96e23da95" (UID: "e708d16b-703f-455a-a806-a9e96e23da95"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.558216 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8fd\" (UniqueName: \"kubernetes.io/projected/768bafb1-558b-4033-a28c-a504a8f75281-kube-api-access-7n8fd\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.558251 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rs8f\" (UniqueName: \"kubernetes.io/projected/e708d16b-703f-455a-a806-a9e96e23da95-kube-api-access-5rs8f\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.558260 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.558270 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.558278 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.558286 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e708d16b-703f-455a-a806-a9e96e23da95-config\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.901687 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g4xv6" event={"ID":"768bafb1-558b-4033-a28c-a504a8f75281","Type":"ContainerDied","Data":"5baac3c42b844c829c24116d8b212cd9bb2513c4662636f76c301e8b784422ad"}
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.901748 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5baac3c42b844c829c24116d8b212cd9bb2513c4662636f76c301e8b784422ad"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.902150 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4xv6"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.904570 4797 generic.go:334] "Generic (PLEG): container finished" podID="e708d16b-703f-455a-a806-a9e96e23da95" containerID="d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e" exitCode=0
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.904621 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" event={"ID":"e708d16b-703f-455a-a806-a9e96e23da95","Type":"ContainerDied","Data":"d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e"}
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.904663 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.904688 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d49c4d845-xmwk4" event={"ID":"e708d16b-703f-455a-a806-a9e96e23da95","Type":"ContainerDied","Data":"c6fb38884b0225996b320460bf9b750805809eea67512f6e06b024e5aa4cb325"}
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.904725 4797 scope.go:117] "RemoveContainer" containerID="d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.944087 4797 scope.go:117] "RemoveContainer" containerID="ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd"
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.955126 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d49c4d845-xmwk4"]
Oct 13 13:24:29 crc kubenswrapper[4797]: I1013 13:24:29.966911 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d49c4d845-xmwk4"]
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.013048 4797 scope.go:117] "RemoveContainer" containerID="d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e"
Oct 13 13:24:30 crc kubenswrapper[4797]: E1013 13:24:30.013439 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e\": container with ID starting with d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e not found: ID does not exist" containerID="d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e"
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.013491 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e"} err="failed to get container status \"d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e\": rpc error: code = NotFound desc = could not find container \"d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e\": container with ID starting with d6db270e188b50752f536af9dc06aee52e97667047bae3efcbb7001e3ed43d2e not found: ID does not exist"
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.013527 4797 scope.go:117] "RemoveContainer" containerID="ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd"
Oct 13 13:24:30 crc kubenswrapper[4797]: E1013 13:24:30.013915 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd\": container with ID starting with ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd not found: ID does not exist" containerID="ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd"
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.014023 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd"} err="failed to get container status \"ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd\": rpc error: code = NotFound desc = could not find container \"ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd\": container with ID starting with ab523cc2a141c6a10760f5d182dde3c8b997392f5b3ffa96dfb116a3cea6dccd not found: ID does not exist"
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.268957 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.370172 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6qjt\" (UniqueName: \"kubernetes.io/projected/830fc84d-bd05-42f9-a2c8-9404e9a1acb7-kube-api-access-q6qjt\") pod \"830fc84d-bd05-42f9-a2c8-9404e9a1acb7\" (UID: \"830fc84d-bd05-42f9-a2c8-9404e9a1acb7\") "
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.381277 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830fc84d-bd05-42f9-a2c8-9404e9a1acb7-kube-api-access-q6qjt" (OuterVolumeSpecName: "kube-api-access-q6qjt") pod "830fc84d-bd05-42f9-a2c8-9404e9a1acb7" (UID: "830fc84d-bd05-42f9-a2c8-9404e9a1acb7"). InnerVolumeSpecName "kube-api-access-q6qjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.472721 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6qjt\" (UniqueName: \"kubernetes.io/projected/830fc84d-bd05-42f9-a2c8-9404e9a1acb7-kube-api-access-q6qjt\") on node \"crc\" DevicePath \"\""
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.919027 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6xtp5" event={"ID":"830fc84d-bd05-42f9-a2c8-9404e9a1acb7","Type":"ContainerDied","Data":"0a4808a0752db027986c41c0d6ac58063967b9ddd08abb8344063a716f9a8581"}
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.919076 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4808a0752db027986c41c0d6ac58063967b9ddd08abb8344063a716f9a8581"
Oct 13 13:24:30 crc kubenswrapper[4797]: I1013 13:24:30.919201 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6xtp5"
Oct 13 13:24:31 crc kubenswrapper[4797]: I1013 13:24:31.254553 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e708d16b-703f-455a-a806-a9e96e23da95" path="/var/lib/kubelet/pods/e708d16b-703f-455a-a806-a9e96e23da95/volumes"
Oct 13 13:24:31 crc kubenswrapper[4797]: I1013 13:24:31.928660 4797 generic.go:334] "Generic (PLEG): container finished" podID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerID="c87767a147d4c8704a237c416cfdc4858485ccdb790924187fa2558d28ea1605" exitCode=0
Oct 13 13:24:31 crc kubenswrapper[4797]: I1013 13:24:31.928751 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"acdec9fc-360a-46e4-89ea-3fde84f417c0","Type":"ContainerDied","Data":"c87767a147d4c8704a237c416cfdc4858485ccdb790924187fa2558d28ea1605"}
Oct 13 13:24:31 crc kubenswrapper[4797]: I1013 13:24:31.930758 4797 generic.go:334] "Generic (PLEG): container finished" podID="9075d4a4-54e2-492f-bc84-bd1fb11df325" containerID="5c5cc2d87050a3de571b1f1b9c6bf168c449bd32a7883ebd8e094f5734bd54f2" exitCode=0
Oct 13 13:24:31 crc kubenswrapper[4797]: I1013 13:24:31.930795 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pn9q8" event={"ID":"9075d4a4-54e2-492f-bc84-bd1fb11df325","Type":"ContainerStarted","Data":"5c5cc2d87050a3de571b1f1b9c6bf168c449bd32a7883ebd8e094f5734bd54f2"}
Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.258991 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.307105 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-htk8n" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" probeResult="failure" output=< Oct 13 13:24:33 crc kubenswrapper[4797]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 13 13:24:33 crc kubenswrapper[4797]: > Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371296 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-ring-data-devices\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371374 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-scripts\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371451 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-dispersionconf\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371485 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-swiftconf\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371512 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9075d4a4-54e2-492f-bc84-bd1fb11df325-etc-swift\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371644 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xws7b\" (UniqueName: \"kubernetes.io/projected/9075d4a4-54e2-492f-bc84-bd1fb11df325-kube-api-access-xws7b\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.371702 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-combined-ca-bundle\") pod \"9075d4a4-54e2-492f-bc84-bd1fb11df325\" (UID: \"9075d4a4-54e2-492f-bc84-bd1fb11df325\") " Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.372131 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.372364 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9075d4a4-54e2-492f-bc84-bd1fb11df325-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.380134 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9075d4a4-54e2-492f-bc84-bd1fb11df325-kube-api-access-xws7b" (OuterVolumeSpecName: "kube-api-access-xws7b") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "kube-api-access-xws7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.382979 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.395382 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-scripts" (OuterVolumeSpecName: "scripts") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.396706 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.405639 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9075d4a4-54e2-492f-bc84-bd1fb11df325" (UID: "9075d4a4-54e2-492f-bc84-bd1fb11df325"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473475 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473877 4797 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473890 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9075d4a4-54e2-492f-bc84-bd1fb11df325-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473903 4797 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473937 4797 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9075d4a4-54e2-492f-bc84-bd1fb11df325-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473949 4797 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/9075d4a4-54e2-492f-bc84-bd1fb11df325-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.473961 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xws7b\" (UniqueName: \"kubernetes.io/projected/9075d4a4-54e2-492f-bc84-bd1fb11df325-kube-api-access-xws7b\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.956058 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"acdec9fc-360a-46e4-89ea-3fde84f417c0","Type":"ContainerStarted","Data":"84b296cdae027daaba2dce536affe2df5bb8565c9eaea497ef3762320f6ea09d"} Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.957214 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.959189 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pn9q8" event={"ID":"9075d4a4-54e2-492f-bc84-bd1fb11df325","Type":"ContainerDied","Data":"5ae409a2b24ba8566cae12870c746c63f0239f6f5ba7c1378070a20c63007d7e"} Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.959214 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae409a2b24ba8566cae12870c746c63f0239f6f5ba7c1378070a20c63007d7e" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.959254 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pn9q8" Oct 13 13:24:33 crc kubenswrapper[4797]: I1013 13:24:33.995618 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.001405835 podStartE2EDuration="1m1.995592653s" podCreationTimestamp="2025-10-13 13:23:32 +0000 UTC" firstStartedPulling="2025-10-13 13:23:34.592885337 +0000 UTC m=+992.126435593" lastFinishedPulling="2025-10-13 13:23:57.587072155 +0000 UTC m=+1015.120622411" observedRunningTime="2025-10-13 13:24:33.990723824 +0000 UTC m=+1051.524274150" watchObservedRunningTime="2025-10-13 13:24:33.995592653 +0000 UTC m=+1051.529142919" Oct 13 13:24:35 crc kubenswrapper[4797]: I1013 13:24:35.610664 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:35 crc kubenswrapper[4797]: I1013 13:24:35.619252 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"swift-storage-0\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " pod="openstack/swift-storage-0" Oct 13 13:24:35 crc kubenswrapper[4797]: I1013 13:24:35.861853 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.440318 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.891956 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-849f-account-create-8cfr6"] Oct 13 13:24:36 crc kubenswrapper[4797]: E1013 13:24:36.892329 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77b871f-f807-4a5d-a6ac-ee918d9d1530" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892345 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77b871f-f807-4a5d-a6ac-ee918d9d1530" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: E1013 13:24:36.892384 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e708d16b-703f-455a-a806-a9e96e23da95" containerName="init" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892392 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e708d16b-703f-455a-a806-a9e96e23da95" containerName="init" Oct 13 13:24:36 crc kubenswrapper[4797]: E1013 13:24:36.892403 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768bafb1-558b-4033-a28c-a504a8f75281" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892411 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="768bafb1-558b-4033-a28c-a504a8f75281" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: E1013 13:24:36.892424 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9075d4a4-54e2-492f-bc84-bd1fb11df325" containerName="swift-ring-rebalance" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892431 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9075d4a4-54e2-492f-bc84-bd1fb11df325" containerName="swift-ring-rebalance" Oct 13 13:24:36 
crc kubenswrapper[4797]: E1013 13:24:36.892446 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e708d16b-703f-455a-a806-a9e96e23da95" containerName="dnsmasq-dns" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892454 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e708d16b-703f-455a-a806-a9e96e23da95" containerName="dnsmasq-dns" Oct 13 13:24:36 crc kubenswrapper[4797]: E1013 13:24:36.892467 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830fc84d-bd05-42f9-a2c8-9404e9a1acb7" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892474 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="830fc84d-bd05-42f9-a2c8-9404e9a1acb7" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892647 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9075d4a4-54e2-492f-bc84-bd1fb11df325" containerName="swift-ring-rebalance" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892664 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77b871f-f807-4a5d-a6ac-ee918d9d1530" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892672 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="830fc84d-bd05-42f9-a2c8-9404e9a1acb7" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892680 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="768bafb1-558b-4033-a28c-a504a8f75281" containerName="mariadb-database-create" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.892696 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e708d16b-703f-455a-a806-a9e96e23da95" containerName="dnsmasq-dns" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.893351 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.899792 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.902150 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-849f-account-create-8cfr6"] Oct 13 13:24:36 crc kubenswrapper[4797]: I1013 13:24:36.983050 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"2be8916fa85b9eacc2af4fecd094384a0088200f5107be5125dbae69fcbeced8"} Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.032341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxhdn\" (UniqueName: \"kubernetes.io/projected/332dc6dc-b877-41b6-836e-ad902bfbb12a-kube-api-access-wxhdn\") pod \"keystone-849f-account-create-8cfr6\" (UID: \"332dc6dc-b877-41b6-836e-ad902bfbb12a\") " pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.134618 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxhdn\" (UniqueName: \"kubernetes.io/projected/332dc6dc-b877-41b6-836e-ad902bfbb12a-kube-api-access-wxhdn\") pod \"keystone-849f-account-create-8cfr6\" (UID: \"332dc6dc-b877-41b6-836e-ad902bfbb12a\") " pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.156711 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-cde4-account-create-7vdmk"] Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.158321 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.164950 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.165953 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cde4-account-create-7vdmk"] Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.193333 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxhdn\" (UniqueName: \"kubernetes.io/projected/332dc6dc-b877-41b6-836e-ad902bfbb12a-kube-api-access-wxhdn\") pod \"keystone-849f-account-create-8cfr6\" (UID: \"332dc6dc-b877-41b6-836e-ad902bfbb12a\") " pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.215423 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.236054 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48gq\" (UniqueName: \"kubernetes.io/projected/c68b9280-7334-47da-ab97-8dcb4c4f3016-kube-api-access-c48gq\") pod \"placement-cde4-account-create-7vdmk\" (UID: \"c68b9280-7334-47da-ab97-8dcb4c4f3016\") " pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.337559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48gq\" (UniqueName: \"kubernetes.io/projected/c68b9280-7334-47da-ab97-8dcb4c4f3016-kube-api-access-c48gq\") pod \"placement-cde4-account-create-7vdmk\" (UID: \"c68b9280-7334-47da-ab97-8dcb4c4f3016\") " pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.355866 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c48gq\" (UniqueName: \"kubernetes.io/projected/c68b9280-7334-47da-ab97-8dcb4c4f3016-kube-api-access-c48gq\") pod \"placement-cde4-account-create-7vdmk\" (UID: \"c68b9280-7334-47da-ab97-8dcb4c4f3016\") " pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.493797 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:37 crc kubenswrapper[4797]: W1013 13:24:37.669785 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod332dc6dc_b877_41b6_836e_ad902bfbb12a.slice/crio-1821cc68aad75b51d481e3b936a5ab7744d80de6278c10879055e011ea84a4f0 WatchSource:0}: Error finding container 1821cc68aad75b51d481e3b936a5ab7744d80de6278c10879055e011ea84a4f0: Status 404 returned error can't find the container with id 1821cc68aad75b51d481e3b936a5ab7744d80de6278c10879055e011ea84a4f0 Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.679624 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-849f-account-create-8cfr6"] Oct 13 13:24:37 crc kubenswrapper[4797]: I1013 13:24:37.953228 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cde4-account-create-7vdmk"] Oct 13 13:24:37 crc kubenswrapper[4797]: W1013 13:24:37.962220 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc68b9280_7334_47da_ab97_8dcb4c4f3016.slice/crio-aea412890ff58c964f1a853d62a2405bc5c94898198f8fc3ad40635be7177f26 WatchSource:0}: Error finding container aea412890ff58c964f1a853d62a2405bc5c94898198f8fc3ad40635be7177f26: Status 404 returned error can't find the container with id aea412890ff58c964f1a853d62a2405bc5c94898198f8fc3ad40635be7177f26 Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.003584 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-cde4-account-create-7vdmk" event={"ID":"c68b9280-7334-47da-ab97-8dcb4c4f3016","Type":"ContainerStarted","Data":"aea412890ff58c964f1a853d62a2405bc5c94898198f8fc3ad40635be7177f26"} Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.005341 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-849f-account-create-8cfr6" event={"ID":"332dc6dc-b877-41b6-836e-ad902bfbb12a","Type":"ContainerStarted","Data":"1821cc68aad75b51d481e3b936a5ab7744d80de6278c10879055e011ea84a4f0"} Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.316067 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-htk8n" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" probeResult="failure" output=< Oct 13 13:24:38 crc kubenswrapper[4797]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 13 13:24:38 crc kubenswrapper[4797]: > Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.338571 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.340061 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.562543 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-htk8n-config-t8q4z"] Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.564117 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.568354 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.577308 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-htk8n-config-t8q4z"] Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.668163 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.668219 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-additional-scripts\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.668291 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-scripts\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.668369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwht\" (UniqueName: \"kubernetes.io/projected/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-kube-api-access-6kwht\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: 
\"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.668695 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-log-ovn\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.668770 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run-ovn\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.770620 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-log-ovn\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771185 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run-ovn\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771274 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: 
\"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771297 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-additional-scripts\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771351 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-scripts\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771411 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwht\" (UniqueName: \"kubernetes.io/projected/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-kube-api-access-6kwht\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771586 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-log-ovn\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771687 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: 
\"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.771730 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run-ovn\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.774059 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-scripts\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.776237 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-additional-scripts\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.794130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwht\" (UniqueName: \"kubernetes.io/projected/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-kube-api-access-6kwht\") pod \"ovn-controller-htk8n-config-t8q4z\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:38 crc kubenswrapper[4797]: I1013 13:24:38.890088 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.036967 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"ad8ecb3fe40030b9df7dc7b9b77499e35e2b251475733a87367b79e69bc6d068"} Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.037489 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"c4ce4b63b22c727785eb5abec889eca7d94fdcd5d0b3cda4520df492e1acde65"} Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.049080 4797 generic.go:334] "Generic (PLEG): container finished" podID="c68b9280-7334-47da-ab97-8dcb4c4f3016" containerID="ab217f3ba3c3ed71ed169dc093a277ef9cafc9d81e5055e8a1c6d6832b21be97" exitCode=0 Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.049170 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cde4-account-create-7vdmk" event={"ID":"c68b9280-7334-47da-ab97-8dcb4c4f3016","Type":"ContainerDied","Data":"ab217f3ba3c3ed71ed169dc093a277ef9cafc9d81e5055e8a1c6d6832b21be97"} Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.057573 4797 generic.go:334] "Generic (PLEG): container finished" podID="332dc6dc-b877-41b6-836e-ad902bfbb12a" containerID="cede006473b611931f79f58356b66eee508c9f0bdede650304bb54fc7058b8f5" exitCode=0 Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.058411 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-849f-account-create-8cfr6" event={"ID":"332dc6dc-b877-41b6-836e-ad902bfbb12a","Type":"ContainerDied","Data":"cede006473b611931f79f58356b66eee508c9f0bdede650304bb54fc7058b8f5"} Oct 13 13:24:39 crc kubenswrapper[4797]: I1013 13:24:39.371251 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-htk8n-config-t8q4z"] 
Oct 13 13:24:39 crc kubenswrapper[4797]: W1013 13:24:39.375062 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9cdda3d_9e70_4e48_9089_b17e0fd4e4ca.slice/crio-4971323b529a13c188fcaaa26d698c42c400e92b30caebf8159f427a15b98ad4 WatchSource:0}: Error finding container 4971323b529a13c188fcaaa26d698c42c400e92b30caebf8159f427a15b98ad4: Status 404 returned error can't find the container with id 4971323b529a13c188fcaaa26d698c42c400e92b30caebf8159f427a15b98ad4 Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.067247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"e14c6e172f563574bf54ee4a48fdf4ed5d505dd4e2e640e88c9ca54cdfa00ed3"} Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.067569 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"fd4ad2d986ae8532b8471784274e34f2e4066c64a28827eb0bdc5088d9c323f3"} Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.069493 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" containerID="2818ad7c05d964e9362c8f835a73e1b7d87549c1a9b7705e996f9b7e2e49537c" exitCode=0 Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.069586 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n-config-t8q4z" event={"ID":"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca","Type":"ContainerDied","Data":"2818ad7c05d964e9362c8f835a73e1b7d87549c1a9b7705e996f9b7e2e49537c"} Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.069612 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n-config-t8q4z" 
event={"ID":"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca","Type":"ContainerStarted","Data":"4971323b529a13c188fcaaa26d698c42c400e92b30caebf8159f427a15b98ad4"} Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.533627 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.541924 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.599225 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxhdn\" (UniqueName: \"kubernetes.io/projected/332dc6dc-b877-41b6-836e-ad902bfbb12a-kube-api-access-wxhdn\") pod \"332dc6dc-b877-41b6-836e-ad902bfbb12a\" (UID: \"332dc6dc-b877-41b6-836e-ad902bfbb12a\") " Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.599304 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c48gq\" (UniqueName: \"kubernetes.io/projected/c68b9280-7334-47da-ab97-8dcb4c4f3016-kube-api-access-c48gq\") pod \"c68b9280-7334-47da-ab97-8dcb4c4f3016\" (UID: \"c68b9280-7334-47da-ab97-8dcb4c4f3016\") " Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.605551 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332dc6dc-b877-41b6-836e-ad902bfbb12a-kube-api-access-wxhdn" (OuterVolumeSpecName: "kube-api-access-wxhdn") pod "332dc6dc-b877-41b6-836e-ad902bfbb12a" (UID: "332dc6dc-b877-41b6-836e-ad902bfbb12a"). InnerVolumeSpecName "kube-api-access-wxhdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.605741 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68b9280-7334-47da-ab97-8dcb4c4f3016-kube-api-access-c48gq" (OuterVolumeSpecName: "kube-api-access-c48gq") pod "c68b9280-7334-47da-ab97-8dcb4c4f3016" (UID: "c68b9280-7334-47da-ab97-8dcb4c4f3016"). InnerVolumeSpecName "kube-api-access-c48gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.701581 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxhdn\" (UniqueName: \"kubernetes.io/projected/332dc6dc-b877-41b6-836e-ad902bfbb12a-kube-api-access-wxhdn\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:40 crc kubenswrapper[4797]: I1013 13:24:40.701620 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c48gq\" (UniqueName: \"kubernetes.io/projected/c68b9280-7334-47da-ab97-8dcb4c4f3016-kube-api-access-c48gq\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.083388 4797 generic.go:334] "Generic (PLEG): container finished" podID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerID="dc60509f6fdf80ab3c7a93d4d24dd520df75f4c9ccb98b44fd3e2e5450ca0b88" exitCode=0 Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.083510 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"21067728-d3cf-4ff2-94c9-87600f7324ab","Type":"ContainerDied","Data":"dc60509f6fdf80ab3c7a93d4d24dd520df75f4c9ccb98b44fd3e2e5450ca0b88"} Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.087174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cde4-account-create-7vdmk" event={"ID":"c68b9280-7334-47da-ab97-8dcb4c4f3016","Type":"ContainerDied","Data":"aea412890ff58c964f1a853d62a2405bc5c94898198f8fc3ad40635be7177f26"} Oct 13 13:24:41 crc kubenswrapper[4797]: 
I1013 13:24:41.087216 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aea412890ff58c964f1a853d62a2405bc5c94898198f8fc3ad40635be7177f26" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.087292 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cde4-account-create-7vdmk" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.094693 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-849f-account-create-8cfr6" event={"ID":"332dc6dc-b877-41b6-836e-ad902bfbb12a","Type":"ContainerDied","Data":"1821cc68aad75b51d481e3b936a5ab7744d80de6278c10879055e011ea84a4f0"} Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.094732 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1821cc68aad75b51d481e3b936a5ab7744d80de6278c10879055e011ea84a4f0" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.094852 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-849f-account-create-8cfr6" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.110882 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"e34ee38f199fac8191d033ce294a7a0959606e96937bb15c3973792f94df9fb9"} Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.366705 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.419004 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-log-ovn\") pod \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.419409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run-ovn\") pod \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.419490 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-scripts\") pod \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.419554 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-additional-scripts\") pod \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.419593 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run\") pod \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.419642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kwht\" (UniqueName: 
\"kubernetes.io/projected/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-kube-api-access-6kwht\") pod \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\" (UID: \"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca\") " Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.421957 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-scripts" (OuterVolumeSpecName: "scripts") pod "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" (UID: "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.422001 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" (UID: "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.422170 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" (UID: "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.423015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" (UID: "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.424015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run" (OuterVolumeSpecName: "var-run") pod "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" (UID: "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.427384 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-kube-api-access-6kwht" (OuterVolumeSpecName: "kube-api-access-6kwht") pod "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" (UID: "d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca"). InnerVolumeSpecName "kube-api-access-6kwht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.521288 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.521337 4797 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.521350 4797 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.521362 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kwht\" (UniqueName: \"kubernetes.io/projected/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-kube-api-access-6kwht\") on node \"crc\" DevicePath 
\"\"" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.521375 4797 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:41 crc kubenswrapper[4797]: I1013 13:24:41.521386 4797 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.122319 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"7fa1e18bf4f048760b982cf61e4fdc5706474901c9f59b877086707fd0c2bad5"} Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.122364 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"a05561b229ddf692864eb50f4e3134b3ae7ce64dcd992877ca1a87198bca77ca"} Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.122376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"826bb87dfbab6e082d77678c431c5f60fe685f53d7f5d122666545d7e4d18b90"} Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.123873 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-htk8n-config-t8q4z" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.123881 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n-config-t8q4z" event={"ID":"d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca","Type":"ContainerDied","Data":"4971323b529a13c188fcaaa26d698c42c400e92b30caebf8159f427a15b98ad4"} Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.123907 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4971323b529a13c188fcaaa26d698c42c400e92b30caebf8159f427a15b98ad4" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.125569 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"21067728-d3cf-4ff2-94c9-87600f7324ab","Type":"ContainerStarted","Data":"b14ae8c7513d0ced2b738189c2e015ff0743dc1feeeda16b9f6380925730cb4f"} Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.125755 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.151383 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371965.703413 podStartE2EDuration="1m11.151363579s" podCreationTimestamp="2025-10-13 13:23:31 +0000 UTC" firstStartedPulling="2025-10-13 13:23:33.791699352 +0000 UTC m=+991.325249598" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:24:42.14940611 +0000 UTC m=+1059.682956376" watchObservedRunningTime="2025-10-13 13:24:42.151363579 +0000 UTC m=+1059.684913835" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.485824 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-496f-account-create-jjvzm"] Oct 13 13:24:42 crc kubenswrapper[4797]: E1013 13:24:42.486321 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" containerName="ovn-config" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.486346 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" containerName="ovn-config" Oct 13 13:24:42 crc kubenswrapper[4797]: E1013 13:24:42.486367 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68b9280-7334-47da-ab97-8dcb4c4f3016" containerName="mariadb-account-create" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.486376 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68b9280-7334-47da-ab97-8dcb4c4f3016" containerName="mariadb-account-create" Oct 13 13:24:42 crc kubenswrapper[4797]: E1013 13:24:42.486394 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332dc6dc-b877-41b6-836e-ad902bfbb12a" containerName="mariadb-account-create" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.486404 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="332dc6dc-b877-41b6-836e-ad902bfbb12a" containerName="mariadb-account-create" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.486618 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" containerName="ovn-config" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.486665 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68b9280-7334-47da-ab97-8dcb4c4f3016" containerName="mariadb-account-create" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.486694 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="332dc6dc-b877-41b6-836e-ad902bfbb12a" containerName="mariadb-account-create" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.487427 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.489520 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.504571 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-496f-account-create-jjvzm"] Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.513563 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-htk8n-config-t8q4z"] Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.522158 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-htk8n-config-t8q4z"] Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.538867 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvjf\" (UniqueName: \"kubernetes.io/projected/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b-kube-api-access-nmvjf\") pod \"glance-496f-account-create-jjvzm\" (UID: \"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b\") " pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.579161 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-htk8n-config-4p5v9"] Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.581337 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.583630 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.591241 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-htk8n-config-4p5v9"] Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640478 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvjf\" (UniqueName: \"kubernetes.io/projected/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b-kube-api-access-nmvjf\") pod \"glance-496f-account-create-jjvzm\" (UID: \"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b\") " pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640572 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-scripts\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640620 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzl7g\" (UniqueName: \"kubernetes.io/projected/a2561f6e-2e49-49a3-89c5-fc967e8d358b-kube-api-access-pzl7g\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640661 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-log-ovn\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: 
\"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640681 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run-ovn\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640732 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.640781 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-additional-scripts\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.667113 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvjf\" (UniqueName: \"kubernetes.io/projected/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b-kube-api-access-nmvjf\") pod \"glance-496f-account-create-jjvzm\" (UID: \"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b\") " pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742236 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-log-ovn\") pod 
\"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742272 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzl7g\" (UniqueName: \"kubernetes.io/projected/a2561f6e-2e49-49a3-89c5-fc967e8d358b-kube-api-access-pzl7g\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run-ovn\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742357 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-additional-scripts\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742494 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-scripts\") pod 
\"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742590 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run-ovn\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.742757 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-log-ovn\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.743398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-additional-scripts\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.745001 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-scripts\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: 
\"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.758973 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzl7g\" (UniqueName: \"kubernetes.io/projected/a2561f6e-2e49-49a3-89c5-fc967e8d358b-kube-api-access-pzl7g\") pod \"ovn-controller-htk8n-config-4p5v9\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.813105 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:42 crc kubenswrapper[4797]: I1013 13:24:42.921733 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:43 crc kubenswrapper[4797]: I1013 13:24:43.282753 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca" path="/var/lib/kubelet/pods/d9cdda3d-9e70-4e48-9089-b17e0fd4e4ca/volumes" Oct 13 13:24:43 crc kubenswrapper[4797]: I1013 13:24:43.315064 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-htk8n" Oct 13 13:24:43 crc kubenswrapper[4797]: I1013 13:24:43.563285 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-496f-account-create-jjvzm"] Oct 13 13:24:43 crc kubenswrapper[4797]: I1013 13:24:43.613865 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-htk8n-config-4p5v9"] Oct 13 13:24:43 crc kubenswrapper[4797]: W1013 13:24:43.619558 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2561f6e_2e49_49a3_89c5_fc967e8d358b.slice/crio-921af66100298befba0a44adb3205cc02eeb1c72af304de3aa90bf366f38bb7e WatchSource:0}: Error finding container 
921af66100298befba0a44adb3205cc02eeb1c72af304de3aa90bf366f38bb7e: Status 404 returned error can't find the container with id 921af66100298befba0a44adb3205cc02eeb1c72af304de3aa90bf366f38bb7e Oct 13 13:24:43 crc kubenswrapper[4797]: I1013 13:24:43.994990 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.150089 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n-config-4p5v9" event={"ID":"a2561f6e-2e49-49a3-89c5-fc967e8d358b","Type":"ContainerStarted","Data":"921af66100298befba0a44adb3205cc02eeb1c72af304de3aa90bf366f38bb7e"} Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.156487 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"88fe19d7acc00d4d082fc5672f55a59aa59797313060f49a5e87d490f16e6bd6"} Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.156531 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"95610d8f86f925fa29c39c4b649aca588a5bfca6031effb0eddf4c0ba26766bc"} Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.156540 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"2c54fef5660226ff362a71ac3e917e001db5e9a347a08deab58b5660776811fc"} Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.156550 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"f98ce95dd4a03f4ac462cac39a646ae12a813083511d7a3c50bdddd1624ba0aa"} Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.173357 4797 generic.go:334] "Generic (PLEG): 
container finished" podID="d8ec4b28-9317-420b-afeb-ac89bd5f0f9b" containerID="784be3c5f941d88bcbbee59370e2e0936ed8962a19a1f5e76ada2d6bea8e66d5" exitCode=0 Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.173423 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-496f-account-create-jjvzm" event={"ID":"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b","Type":"ContainerDied","Data":"784be3c5f941d88bcbbee59370e2e0936ed8962a19a1f5e76ada2d6bea8e66d5"} Oct 13 13:24:44 crc kubenswrapper[4797]: I1013 13:24:44.173463 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-496f-account-create-jjvzm" event={"ID":"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b","Type":"ContainerStarted","Data":"ea0daee27fc14903ed30ac8f12cf1d568ca3979c97d69e1d0ae7fa88ae50087c"} Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.181901 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2561f6e-2e49-49a3-89c5-fc967e8d358b" containerID="4360772a9d1e2f22d64f0217e01b149d341125b39f692cf2ceaffd1c6f90417b" exitCode=0 Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.181950 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n-config-4p5v9" event={"ID":"a2561f6e-2e49-49a3-89c5-fc967e8d358b","Type":"ContainerDied","Data":"4360772a9d1e2f22d64f0217e01b149d341125b39f692cf2ceaffd1c6f90417b"} Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.188524 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"df5690bc37dc98f263a51145643771f0253fa3da622e37dab0ac9bd217f8b17d"} Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.188588 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"7c966ea8b8d377fe98198c55458614115fcf39c47512b6e6a01c20502d489780"} Oct 13 13:24:45 crc 
kubenswrapper[4797]: I1013 13:24:45.188604 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerStarted","Data":"0771ac639aaa1011650222d8f818316f8651f5729cf3d02726d6f291d0ca0403"} Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.234458 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.608724418 podStartE2EDuration="27.234435329s" podCreationTimestamp="2025-10-13 13:24:18 +0000 UTC" firstStartedPulling="2025-10-13 13:24:36.45265897 +0000 UTC m=+1053.986209236" lastFinishedPulling="2025-10-13 13:24:43.078369891 +0000 UTC m=+1060.611920147" observedRunningTime="2025-10-13 13:24:45.234164042 +0000 UTC m=+1062.767714308" watchObservedRunningTime="2025-10-13 13:24:45.234435329 +0000 UTC m=+1062.767985585" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.512450 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.513289 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d9ddcb47c-khxjk"] Oct 13 13:24:45 crc kubenswrapper[4797]: E1013 13:24:45.513664 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ec4b28-9317-420b-afeb-ac89bd5f0f9b" containerName="mariadb-account-create" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.513683 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ec4b28-9317-420b-afeb-ac89bd5f0f9b" containerName="mariadb-account-create" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.513921 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ec4b28-9317-420b-afeb-ac89bd5f0f9b" containerName="mariadb-account-create" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.514739 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.524407 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.531397 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9ddcb47c-khxjk"] Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.603961 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmvjf\" (UniqueName: \"kubernetes.io/projected/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b-kube-api-access-nmvjf\") pod \"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b\" (UID: \"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b\") " Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.605243 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-config\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.605340 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-sb\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.605377 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-svc\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: 
I1013 13:24:45.605431 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9265\" (UniqueName: \"kubernetes.io/projected/885a17a7-d224-4081-8568-8d362c86f321-kube-api-access-w9265\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.605468 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-nb\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.605559 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-swift-storage-0\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.614473 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b-kube-api-access-nmvjf" (OuterVolumeSpecName: "kube-api-access-nmvjf") pod "d8ec4b28-9317-420b-afeb-ac89bd5f0f9b" (UID: "d8ec4b28-9317-420b-afeb-ac89bd5f0f9b"). InnerVolumeSpecName "kube-api-access-nmvjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706474 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-svc\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706515 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9265\" (UniqueName: \"kubernetes.io/projected/885a17a7-d224-4081-8568-8d362c86f321-kube-api-access-w9265\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-nb\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706589 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-swift-storage-0\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-config\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 
13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706665 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-sb\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.706709 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmvjf\" (UniqueName: \"kubernetes.io/projected/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b-kube-api-access-nmvjf\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.707517 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-svc\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.707553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-sb\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.707677 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-nb\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.708159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-config\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.709008 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-swift-storage-0\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.726275 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9265\" (UniqueName: \"kubernetes.io/projected/885a17a7-d224-4081-8568-8d362c86f321-kube-api-access-w9265\") pod \"dnsmasq-dns-d9ddcb47c-khxjk\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:45 crc kubenswrapper[4797]: I1013 13:24:45.836964 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.202609 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-496f-account-create-jjvzm" event={"ID":"d8ec4b28-9317-420b-afeb-ac89bd5f0f9b","Type":"ContainerDied","Data":"ea0daee27fc14903ed30ac8f12cf1d568ca3979c97d69e1d0ae7fa88ae50087c"} Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.202673 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea0daee27fc14903ed30ac8f12cf1d568ca3979c97d69e1d0ae7fa88ae50087c" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.202691 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-496f-account-create-jjvzm" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.337985 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9ddcb47c-khxjk"] Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.553062 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.723233 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-scripts\") pod \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.723569 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run-ovn\") pod \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.723632 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzl7g\" (UniqueName: \"kubernetes.io/projected/a2561f6e-2e49-49a3-89c5-fc967e8d358b-kube-api-access-pzl7g\") pod \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.723669 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run\") pod \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.723776 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-additional-scripts\") pod \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.723811 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-log-ovn\") pod \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\" (UID: \"a2561f6e-2e49-49a3-89c5-fc967e8d358b\") " Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.724115 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a2561f6e-2e49-49a3-89c5-fc967e8d358b" (UID: "a2561f6e-2e49-49a3-89c5-fc967e8d358b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.724168 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a2561f6e-2e49-49a3-89c5-fc967e8d358b" (UID: "a2561f6e-2e49-49a3-89c5-fc967e8d358b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.724851 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run" (OuterVolumeSpecName: "var-run") pod "a2561f6e-2e49-49a3-89c5-fc967e8d358b" (UID: "a2561f6e-2e49-49a3-89c5-fc967e8d358b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.725242 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a2561f6e-2e49-49a3-89c5-fc967e8d358b" (UID: "a2561f6e-2e49-49a3-89c5-fc967e8d358b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.725379 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-scripts" (OuterVolumeSpecName: "scripts") pod "a2561f6e-2e49-49a3-89c5-fc967e8d358b" (UID: "a2561f6e-2e49-49a3-89c5-fc967e8d358b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.728952 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2561f6e-2e49-49a3-89c5-fc967e8d358b-kube-api-access-pzl7g" (OuterVolumeSpecName: "kube-api-access-pzl7g") pod "a2561f6e-2e49-49a3-89c5-fc967e8d358b" (UID: "a2561f6e-2e49-49a3-89c5-fc967e8d358b"). InnerVolumeSpecName "kube-api-access-pzl7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.826004 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.826044 4797 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.826057 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzl7g\" (UniqueName: \"kubernetes.io/projected/a2561f6e-2e49-49a3-89c5-fc967e8d358b-kube-api-access-pzl7g\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.826074 4797 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.826084 4797 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a2561f6e-2e49-49a3-89c5-fc967e8d358b-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:46 crc kubenswrapper[4797]: I1013 13:24:46.826095 4797 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2561f6e-2e49-49a3-89c5-fc967e8d358b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.208759 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n-config-4p5v9" event={"ID":"a2561f6e-2e49-49a3-89c5-fc967e8d358b","Type":"ContainerDied","Data":"921af66100298befba0a44adb3205cc02eeb1c72af304de3aa90bf366f38bb7e"} Oct 13 13:24:47 crc 
kubenswrapper[4797]: I1013 13:24:47.208779 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-htk8n-config-4p5v9" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.208797 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="921af66100298befba0a44adb3205cc02eeb1c72af304de3aa90bf366f38bb7e" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.210472 4797 generic.go:334] "Generic (PLEG): container finished" podID="885a17a7-d224-4081-8568-8d362c86f321" containerID="61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad" exitCode=0 Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.210500 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" event={"ID":"885a17a7-d224-4081-8568-8d362c86f321","Type":"ContainerDied","Data":"61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad"} Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.210521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" event={"ID":"885a17a7-d224-4081-8568-8d362c86f321","Type":"ContainerStarted","Data":"4a2aa2afa41f15b05c4cfa9ec64a8b333cdbbe39238d300134fa294cde0b5f2f"} Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.532349 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jrlnp"] Oct 13 13:24:47 crc kubenswrapper[4797]: E1013 13:24:47.532906 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2561f6e-2e49-49a3-89c5-fc967e8d358b" containerName="ovn-config" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.532983 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2561f6e-2e49-49a3-89c5-fc967e8d358b" containerName="ovn-config" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.533211 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2561f6e-2e49-49a3-89c5-fc967e8d358b" 
containerName="ovn-config" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.533798 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.536508 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.536537 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c7pxm" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.537773 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-db-sync-config-data\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.537854 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkvqt\" (UniqueName: \"kubernetes.io/projected/646c1def-f060-445e-b0a1-c616965aca86-kube-api-access-fkvqt\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.537941 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-combined-ca-bundle\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.537966 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-config-data\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.544921 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jrlnp"] Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.628412 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-htk8n-config-4p5v9"] Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.634182 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-htk8n-config-4p5v9"] Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.638758 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-combined-ca-bundle\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.638970 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-config-data\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.639141 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-db-sync-config-data\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.639228 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkvqt\" (UniqueName: 
\"kubernetes.io/projected/646c1def-f060-445e-b0a1-c616965aca86-kube-api-access-fkvqt\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.642328 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-combined-ca-bundle\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.642490 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-config-data\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.646513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-db-sync-config-data\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.657473 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkvqt\" (UniqueName: \"kubernetes.io/projected/646c1def-f060-445e-b0a1-c616965aca86-kube-api-access-fkvqt\") pod \"glance-db-sync-jrlnp\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:47 crc kubenswrapper[4797]: I1013 13:24:47.852207 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jrlnp" Oct 13 13:24:48 crc kubenswrapper[4797]: I1013 13:24:48.220409 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" event={"ID":"885a17a7-d224-4081-8568-8d362c86f321","Type":"ContainerStarted","Data":"016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd"} Oct 13 13:24:48 crc kubenswrapper[4797]: I1013 13:24:48.221628 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:48 crc kubenswrapper[4797]: I1013 13:24:48.241131 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" podStartSLOduration=3.241111504 podStartE2EDuration="3.241111504s" podCreationTimestamp="2025-10-13 13:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:24:48.235711412 +0000 UTC m=+1065.769261668" watchObservedRunningTime="2025-10-13 13:24:48.241111504 +0000 UTC m=+1065.774661770" Oct 13 13:24:48 crc kubenswrapper[4797]: I1013 13:24:48.340464 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jrlnp"] Oct 13 13:24:48 crc kubenswrapper[4797]: W1013 13:24:48.348384 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod646c1def_f060_445e_b0a1_c616965aca86.slice/crio-966b53b614fac9fce289d23cfdf2d844277604440f5d9b97ede1c7190255bb01 WatchSource:0}: Error finding container 966b53b614fac9fce289d23cfdf2d844277604440f5d9b97ede1c7190255bb01: Status 404 returned error can't find the container with id 966b53b614fac9fce289d23cfdf2d844277604440f5d9b97ede1c7190255bb01 Oct 13 13:24:49 crc kubenswrapper[4797]: I1013 13:24:49.228632 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jrlnp" 
event={"ID":"646c1def-f060-445e-b0a1-c616965aca86","Type":"ContainerStarted","Data":"966b53b614fac9fce289d23cfdf2d844277604440f5d9b97ede1c7190255bb01"} Oct 13 13:24:49 crc kubenswrapper[4797]: I1013 13:24:49.245443 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2561f6e-2e49-49a3-89c5-fc967e8d358b" path="/var/lib/kubelet/pods/a2561f6e-2e49-49a3-89c5-fc967e8d358b/volumes" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.184412 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.587644 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jdw9k"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.589313 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdw9k" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.602154 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jdw9k"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.689574 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7hjgk"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.690601 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7hjgk" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.702337 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7hjgk"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.738723 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76s9p\" (UniqueName: \"kubernetes.io/projected/23264a32-2fc6-48e6-aa01-6c6cf519f5b5-kube-api-access-76s9p\") pod \"cinder-db-create-jdw9k\" (UID: \"23264a32-2fc6-48e6-aa01-6c6cf519f5b5\") " pod="openstack/cinder-db-create-jdw9k" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.795858 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vchss"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.796807 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vchss" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.804480 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vchss"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.839709 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76s9p\" (UniqueName: \"kubernetes.io/projected/23264a32-2fc6-48e6-aa01-6c6cf519f5b5-kube-api-access-76s9p\") pod \"cinder-db-create-jdw9k\" (UID: \"23264a32-2fc6-48e6-aa01-6c6cf519f5b5\") " pod="openstack/cinder-db-create-jdw9k" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.839771 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plxzb\" (UniqueName: \"kubernetes.io/projected/4d71a130-6ed9-4cba-8f40-dea82ad6e42e-kube-api-access-plxzb\") pod \"barbican-db-create-7hjgk\" (UID: \"4d71a130-6ed9-4cba-8f40-dea82ad6e42e\") " pod="openstack/barbican-db-create-7hjgk" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.853250 
4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8ftfs"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.854735 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.856269 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.857464 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.857733 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.858439 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4snmq" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.863577 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76s9p\" (UniqueName: \"kubernetes.io/projected/23264a32-2fc6-48e6-aa01-6c6cf519f5b5-kube-api-access-76s9p\") pod \"cinder-db-create-jdw9k\" (UID: \"23264a32-2fc6-48e6-aa01-6c6cf519f5b5\") " pod="openstack/cinder-db-create-jdw9k" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.879862 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8ftfs"] Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.914988 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jdw9k" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.941698 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxbv\" (UniqueName: \"kubernetes.io/projected/d35fbf40-e053-49c6-9eb9-fbdd8337061c-kube-api-access-ttxbv\") pod \"neutron-db-create-vchss\" (UID: \"d35fbf40-e053-49c6-9eb9-fbdd8337061c\") " pod="openstack/neutron-db-create-vchss" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.941791 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plxzb\" (UniqueName: \"kubernetes.io/projected/4d71a130-6ed9-4cba-8f40-dea82ad6e42e-kube-api-access-plxzb\") pod \"barbican-db-create-7hjgk\" (UID: \"4d71a130-6ed9-4cba-8f40-dea82ad6e42e\") " pod="openstack/barbican-db-create-7hjgk" Oct 13 13:24:53 crc kubenswrapper[4797]: I1013 13:24:53.959701 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plxzb\" (UniqueName: \"kubernetes.io/projected/4d71a130-6ed9-4cba-8f40-dea82ad6e42e-kube-api-access-plxzb\") pod \"barbican-db-create-7hjgk\" (UID: \"4d71a130-6ed9-4cba-8f40-dea82ad6e42e\") " pod="openstack/barbican-db-create-7hjgk" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.006842 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7hjgk" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.043306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-config-data\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.043384 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qnt\" (UniqueName: \"kubernetes.io/projected/5a945414-93e9-401f-aa0e-15e040d78017-kube-api-access-85qnt\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.043428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttxbv\" (UniqueName: \"kubernetes.io/projected/d35fbf40-e053-49c6-9eb9-fbdd8337061c-kube-api-access-ttxbv\") pod \"neutron-db-create-vchss\" (UID: \"d35fbf40-e053-49c6-9eb9-fbdd8337061c\") " pod="openstack/neutron-db-create-vchss" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.043501 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-combined-ca-bundle\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.067246 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttxbv\" (UniqueName: \"kubernetes.io/projected/d35fbf40-e053-49c6-9eb9-fbdd8337061c-kube-api-access-ttxbv\") pod \"neutron-db-create-vchss\" (UID: \"d35fbf40-e053-49c6-9eb9-fbdd8337061c\") 
" pod="openstack/neutron-db-create-vchss" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.114124 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vchss" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.144892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-config-data\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.144978 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qnt\" (UniqueName: \"kubernetes.io/projected/5a945414-93e9-401f-aa0e-15e040d78017-kube-api-access-85qnt\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.145072 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-combined-ca-bundle\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.148491 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-config-data\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.149720 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-combined-ca-bundle\") pod 
\"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.161788 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qnt\" (UniqueName: \"kubernetes.io/projected/5a945414-93e9-401f-aa0e-15e040d78017-kube-api-access-85qnt\") pod \"keystone-db-sync-8ftfs\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:54 crc kubenswrapper[4797]: I1013 13:24:54.212114 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:24:55 crc kubenswrapper[4797]: I1013 13:24:55.838666 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:24:55 crc kubenswrapper[4797]: I1013 13:24:55.897524 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c757dd68f-28ns7"] Oct 13 13:24:55 crc kubenswrapper[4797]: I1013 13:24:55.897778 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerName="dnsmasq-dns" containerID="cri-o://6c7f3f5a65043f5417a69014fd274093dd20803d37acd9e49b2e231982ecaa1a" gracePeriod=10 Oct 13 13:24:56 crc kubenswrapper[4797]: I1013 13:24:56.312396 4797 generic.go:334] "Generic (PLEG): container finished" podID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerID="6c7f3f5a65043f5417a69014fd274093dd20803d37acd9e49b2e231982ecaa1a" exitCode=0 Oct 13 13:24:56 crc kubenswrapper[4797]: I1013 13:24:56.312474 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" event={"ID":"b0488671-d7b4-4c33-a64b-b163a812f2eb","Type":"ContainerDied","Data":"6c7f3f5a65043f5417a69014fd274093dd20803d37acd9e49b2e231982ecaa1a"} Oct 13 13:24:58 crc kubenswrapper[4797]: I1013 
13:24:58.839791 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.130647 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.245591 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-sb\") pod \"b0488671-d7b4-4c33-a64b-b163a812f2eb\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.246368 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-dns-svc\") pod \"b0488671-d7b4-4c33-a64b-b163a812f2eb\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.246570 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-nb\") pod \"b0488671-d7b4-4c33-a64b-b163a812f2eb\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.246813 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-config\") pod \"b0488671-d7b4-4c33-a64b-b163a812f2eb\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.246995 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-blm8c\" (UniqueName: \"kubernetes.io/projected/b0488671-d7b4-4c33-a64b-b163a812f2eb-kube-api-access-blm8c\") pod \"b0488671-d7b4-4c33-a64b-b163a812f2eb\" (UID: \"b0488671-d7b4-4c33-a64b-b163a812f2eb\") " Oct 13 13:25:00 crc kubenswrapper[4797]: I1013 13:25:00.251483 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0488671-d7b4-4c33-a64b-b163a812f2eb-kube-api-access-blm8c" (OuterVolumeSpecName: "kube-api-access-blm8c") pod "b0488671-d7b4-4c33-a64b-b163a812f2eb" (UID: "b0488671-d7b4-4c33-a64b-b163a812f2eb"). InnerVolumeSpecName "kube-api-access-blm8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.297006 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0488671-d7b4-4c33-a64b-b163a812f2eb" (UID: "b0488671-d7b4-4c33-a64b-b163a812f2eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.297385 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-config" (OuterVolumeSpecName: "config") pod "b0488671-d7b4-4c33-a64b-b163a812f2eb" (UID: "b0488671-d7b4-4c33-a64b-b163a812f2eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.297439 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0488671-d7b4-4c33-a64b-b163a812f2eb" (UID: "b0488671-d7b4-4c33-a64b-b163a812f2eb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.326163 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0488671-d7b4-4c33-a64b-b163a812f2eb" (UID: "b0488671-d7b4-4c33-a64b-b163a812f2eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.349371 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.349398 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blm8c\" (UniqueName: \"kubernetes.io/projected/b0488671-d7b4-4c33-a64b-b163a812f2eb-kube-api-access-blm8c\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.349410 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.349418 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.349426 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0488671-d7b4-4c33-a64b-b163a812f2eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.354100 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" 
event={"ID":"b0488671-d7b4-4c33-a64b-b163a812f2eb","Type":"ContainerDied","Data":"c23a409ccdf4ecbc45c1b97b1fb1448926b8a288d1eeb026c58bb66f0903ea5e"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.354171 4797 scope.go:117] "RemoveContainer" containerID="6c7f3f5a65043f5417a69014fd274093dd20803d37acd9e49b2e231982ecaa1a" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.354316 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c757dd68f-28ns7" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.378577 4797 scope.go:117] "RemoveContainer" containerID="bae487224769ef9bfd1228980c3701a1b9e337f37dd3010452a6d49a59eff1c3" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.417975 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c757dd68f-28ns7"] Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.427084 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c757dd68f-28ns7"] Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.444215 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8ftfs"] Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.452879 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vchss"] Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:00.464567 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jdw9k"] Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.245016 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" path="/var/lib/kubelet/pods/b0488671-d7b4-4c33-a64b-b163a812f2eb/volumes" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.368552 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jrlnp" 
event={"ID":"646c1def-f060-445e-b0a1-c616965aca86","Type":"ContainerStarted","Data":"4c4d14744c4c686a72e50ef1567e0aedf9062d7e4844b7fb8c13d286b0b7f12b"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.370886 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ftfs" event={"ID":"5a945414-93e9-401f-aa0e-15e040d78017","Type":"ContainerStarted","Data":"9f51698068d90c50ec68e5f1542b2dc5a2212a554176a69d1ad5bcc89c2e7e8f"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.375273 4797 generic.go:334] "Generic (PLEG): container finished" podID="d35fbf40-e053-49c6-9eb9-fbdd8337061c" containerID="569a075615ac9ed0ec14fd0173ec0ac70aa932fe695c353fe31c61bed8fd10e7" exitCode=0 Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.375524 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vchss" event={"ID":"d35fbf40-e053-49c6-9eb9-fbdd8337061c","Type":"ContainerDied","Data":"569a075615ac9ed0ec14fd0173ec0ac70aa932fe695c353fe31c61bed8fd10e7"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.375591 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vchss" event={"ID":"d35fbf40-e053-49c6-9eb9-fbdd8337061c","Type":"ContainerStarted","Data":"e942870a8603822d0a09091e7ae669490f83c7300f245a1f5c3dde7a8d2fb4fa"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.377027 4797 generic.go:334] "Generic (PLEG): container finished" podID="23264a32-2fc6-48e6-aa01-6c6cf519f5b5" containerID="6558c30f6c2510b59df66709040b96084db877c0138ba5ac939214197756b028" exitCode=0 Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.377109 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdw9k" event={"ID":"23264a32-2fc6-48e6-aa01-6c6cf519f5b5","Type":"ContainerDied","Data":"6558c30f6c2510b59df66709040b96084db877c0138ba5ac939214197756b028"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.377149 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-create-jdw9k" event={"ID":"23264a32-2fc6-48e6-aa01-6c6cf519f5b5","Type":"ContainerStarted","Data":"93c7547edfb1b8b48495401b60a9b1a2d20e3a085b69d702285686efa1b19352"} Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.392350 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jrlnp" podStartSLOduration=2.806986348 podStartE2EDuration="14.392323038s" podCreationTimestamp="2025-10-13 13:24:47 +0000 UTC" firstStartedPulling="2025-10-13 13:24:48.353298288 +0000 UTC m=+1065.886848544" lastFinishedPulling="2025-10-13 13:24:59.938634978 +0000 UTC m=+1077.472185234" observedRunningTime="2025-10-13 13:25:01.391171329 +0000 UTC m=+1078.924721615" watchObservedRunningTime="2025-10-13 13:25:01.392323038 +0000 UTC m=+1078.925873284" Oct 13 13:25:01 crc kubenswrapper[4797]: I1013 13:25:01.422284 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7hjgk"] Oct 13 13:25:02 crc kubenswrapper[4797]: I1013 13:25:02.390881 4797 generic.go:334] "Generic (PLEG): container finished" podID="4d71a130-6ed9-4cba-8f40-dea82ad6e42e" containerID="f289e553a5332ee32521ef8171b127e08ae0d885341334c6dd1917ebd2145de5" exitCode=0 Oct 13 13:25:02 crc kubenswrapper[4797]: I1013 13:25:02.391099 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7hjgk" event={"ID":"4d71a130-6ed9-4cba-8f40-dea82ad6e42e","Type":"ContainerDied","Data":"f289e553a5332ee32521ef8171b127e08ae0d885341334c6dd1917ebd2145de5"} Oct 13 13:25:02 crc kubenswrapper[4797]: I1013 13:25:02.391164 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7hjgk" event={"ID":"4d71a130-6ed9-4cba-8f40-dea82ad6e42e","Type":"ContainerStarted","Data":"b9fa5c491fa057707a1882bd165199088f48cd2dd716693552c1d6a99f340261"} Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.132666 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7hjgk" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.137098 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdw9k" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.141545 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vchss" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.269323 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttxbv\" (UniqueName: \"kubernetes.io/projected/d35fbf40-e053-49c6-9eb9-fbdd8337061c-kube-api-access-ttxbv\") pod \"d35fbf40-e053-49c6-9eb9-fbdd8337061c\" (UID: \"d35fbf40-e053-49c6-9eb9-fbdd8337061c\") " Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.269586 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plxzb\" (UniqueName: \"kubernetes.io/projected/4d71a130-6ed9-4cba-8f40-dea82ad6e42e-kube-api-access-plxzb\") pod \"4d71a130-6ed9-4cba-8f40-dea82ad6e42e\" (UID: \"4d71a130-6ed9-4cba-8f40-dea82ad6e42e\") " Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.269668 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76s9p\" (UniqueName: \"kubernetes.io/projected/23264a32-2fc6-48e6-aa01-6c6cf519f5b5-kube-api-access-76s9p\") pod \"23264a32-2fc6-48e6-aa01-6c6cf519f5b5\" (UID: \"23264a32-2fc6-48e6-aa01-6c6cf519f5b5\") " Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.273431 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35fbf40-e053-49c6-9eb9-fbdd8337061c-kube-api-access-ttxbv" (OuterVolumeSpecName: "kube-api-access-ttxbv") pod "d35fbf40-e053-49c6-9eb9-fbdd8337061c" (UID: "d35fbf40-e053-49c6-9eb9-fbdd8337061c"). InnerVolumeSpecName "kube-api-access-ttxbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.275052 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23264a32-2fc6-48e6-aa01-6c6cf519f5b5-kube-api-access-76s9p" (OuterVolumeSpecName: "kube-api-access-76s9p") pod "23264a32-2fc6-48e6-aa01-6c6cf519f5b5" (UID: "23264a32-2fc6-48e6-aa01-6c6cf519f5b5"). InnerVolumeSpecName "kube-api-access-76s9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.275132 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d71a130-6ed9-4cba-8f40-dea82ad6e42e-kube-api-access-plxzb" (OuterVolumeSpecName: "kube-api-access-plxzb") pod "4d71a130-6ed9-4cba-8f40-dea82ad6e42e" (UID: "4d71a130-6ed9-4cba-8f40-dea82ad6e42e"). InnerVolumeSpecName "kube-api-access-plxzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.372250 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plxzb\" (UniqueName: \"kubernetes.io/projected/4d71a130-6ed9-4cba-8f40-dea82ad6e42e-kube-api-access-plxzb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.372308 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76s9p\" (UniqueName: \"kubernetes.io/projected/23264a32-2fc6-48e6-aa01-6c6cf519f5b5-kube-api-access-76s9p\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.372331 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttxbv\" (UniqueName: \"kubernetes.io/projected/d35fbf40-e053-49c6-9eb9-fbdd8337061c-kube-api-access-ttxbv\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.432247 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7hjgk" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.432244 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7hjgk" event={"ID":"4d71a130-6ed9-4cba-8f40-dea82ad6e42e","Type":"ContainerDied","Data":"b9fa5c491fa057707a1882bd165199088f48cd2dd716693552c1d6a99f340261"} Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.432374 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9fa5c491fa057707a1882bd165199088f48cd2dd716693552c1d6a99f340261" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.435661 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdw9k" event={"ID":"23264a32-2fc6-48e6-aa01-6c6cf519f5b5","Type":"ContainerDied","Data":"93c7547edfb1b8b48495401b60a9b1a2d20e3a085b69d702285686efa1b19352"} Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.435702 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93c7547edfb1b8b48495401b60a9b1a2d20e3a085b69d702285686efa1b19352" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.435667 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jdw9k" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.437107 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ftfs" event={"ID":"5a945414-93e9-401f-aa0e-15e040d78017","Type":"ContainerStarted","Data":"80cd488390657ee26fda2dfd41b682aa23d52a2cb5737ee54f682386c5c48619"} Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.438682 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vchss" event={"ID":"d35fbf40-e053-49c6-9eb9-fbdd8337061c","Type":"ContainerDied","Data":"e942870a8603822d0a09091e7ae669490f83c7300f245a1f5c3dde7a8d2fb4fa"} Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.438710 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e942870a8603822d0a09091e7ae669490f83c7300f245a1f5c3dde7a8d2fb4fa" Oct 13 13:25:07 crc kubenswrapper[4797]: I1013 13:25:07.438965 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vchss" Oct 13 13:25:08 crc kubenswrapper[4797]: I1013 13:25:08.173771 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8ftfs" podStartSLOduration=8.623539442 podStartE2EDuration="15.17373455s" podCreationTimestamp="2025-10-13 13:24:53 +0000 UTC" firstStartedPulling="2025-10-13 13:25:00.448701157 +0000 UTC m=+1077.982251413" lastFinishedPulling="2025-10-13 13:25:06.998896255 +0000 UTC m=+1084.532446521" observedRunningTime="2025-10-13 13:25:07.454175439 +0000 UTC m=+1084.987725695" watchObservedRunningTime="2025-10-13 13:25:08.17373455 +0000 UTC m=+1085.707284856" Oct 13 13:25:08 crc kubenswrapper[4797]: I1013 13:25:08.457120 4797 generic.go:334] "Generic (PLEG): container finished" podID="646c1def-f060-445e-b0a1-c616965aca86" containerID="4c4d14744c4c686a72e50ef1567e0aedf9062d7e4844b7fb8c13d286b0b7f12b" exitCode=0 Oct 13 13:25:08 crc kubenswrapper[4797]: I1013 13:25:08.457269 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jrlnp" event={"ID":"646c1def-f060-445e-b0a1-c616965aca86","Type":"ContainerDied","Data":"4c4d14744c4c686a72e50ef1567e0aedf9062d7e4844b7fb8c13d286b0b7f12b"} Oct 13 13:25:09 crc kubenswrapper[4797]: I1013 13:25:09.960580 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jrlnp" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.128341 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-combined-ca-bundle\") pod \"646c1def-f060-445e-b0a1-c616965aca86\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.128434 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkvqt\" (UniqueName: \"kubernetes.io/projected/646c1def-f060-445e-b0a1-c616965aca86-kube-api-access-fkvqt\") pod \"646c1def-f060-445e-b0a1-c616965aca86\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.128546 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-config-data\") pod \"646c1def-f060-445e-b0a1-c616965aca86\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.128602 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-db-sync-config-data\") pod \"646c1def-f060-445e-b0a1-c616965aca86\" (UID: \"646c1def-f060-445e-b0a1-c616965aca86\") " Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.133777 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646c1def-f060-445e-b0a1-c616965aca86-kube-api-access-fkvqt" (OuterVolumeSpecName: "kube-api-access-fkvqt") pod "646c1def-f060-445e-b0a1-c616965aca86" (UID: "646c1def-f060-445e-b0a1-c616965aca86"). InnerVolumeSpecName "kube-api-access-fkvqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.134482 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "646c1def-f060-445e-b0a1-c616965aca86" (UID: "646c1def-f060-445e-b0a1-c616965aca86"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.150553 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "646c1def-f060-445e-b0a1-c616965aca86" (UID: "646c1def-f060-445e-b0a1-c616965aca86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.173738 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-config-data" (OuterVolumeSpecName: "config-data") pod "646c1def-f060-445e-b0a1-c616965aca86" (UID: "646c1def-f060-445e-b0a1-c616965aca86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.230187 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.230232 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkvqt\" (UniqueName: \"kubernetes.io/projected/646c1def-f060-445e-b0a1-c616965aca86-kube-api-access-fkvqt\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.230246 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.230258 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/646c1def-f060-445e-b0a1-c616965aca86-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.497499 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jrlnp" event={"ID":"646c1def-f060-445e-b0a1-c616965aca86","Type":"ContainerDied","Data":"966b53b614fac9fce289d23cfdf2d844277604440f5d9b97ede1c7190255bb01"} Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.497562 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="966b53b614fac9fce289d23cfdf2d844277604440f5d9b97ede1c7190255bb01" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.497650 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jrlnp" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.504923 4797 generic.go:334] "Generic (PLEG): container finished" podID="5a945414-93e9-401f-aa0e-15e040d78017" containerID="80cd488390657ee26fda2dfd41b682aa23d52a2cb5737ee54f682386c5c48619" exitCode=0 Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.505002 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ftfs" event={"ID":"5a945414-93e9-401f-aa0e-15e040d78017","Type":"ContainerDied","Data":"80cd488390657ee26fda2dfd41b682aa23d52a2cb5737ee54f682386c5c48619"} Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872211 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b4dc6c99-zvnw5"] Oct 13 13:25:10 crc kubenswrapper[4797]: E1013 13:25:10.872756 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35fbf40-e053-49c6-9eb9-fbdd8337061c" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872771 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35fbf40-e053-49c6-9eb9-fbdd8337061c" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: E1013 13:25:10.872785 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23264a32-2fc6-48e6-aa01-6c6cf519f5b5" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872792 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="23264a32-2fc6-48e6-aa01-6c6cf519f5b5" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: E1013 13:25:10.872817 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerName="init" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872823 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerName="init" Oct 13 
13:25:10 crc kubenswrapper[4797]: E1013 13:25:10.872835 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646c1def-f060-445e-b0a1-c616965aca86" containerName="glance-db-sync" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872840 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="646c1def-f060-445e-b0a1-c616965aca86" containerName="glance-db-sync" Oct 13 13:25:10 crc kubenswrapper[4797]: E1013 13:25:10.872848 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerName="dnsmasq-dns" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872853 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" containerName="dnsmasq-dns" Oct 13 13:25:10 crc kubenswrapper[4797]: E1013 13:25:10.872872 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d71a130-6ed9-4cba-8f40-dea82ad6e42e" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.872878 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d71a130-6ed9-4cba-8f40-dea82ad6e42e" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.873029 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="23264a32-2fc6-48e6-aa01-6c6cf519f5b5" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.873045 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35fbf40-e053-49c6-9eb9-fbdd8337061c" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.873052 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d71a130-6ed9-4cba-8f40-dea82ad6e42e" containerName="mariadb-database-create" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.873063 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0488671-d7b4-4c33-a64b-b163a812f2eb" 
containerName="dnsmasq-dns" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.873077 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="646c1def-f060-445e-b0a1-c616965aca86" containerName="glance-db-sync" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.873890 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:10 crc kubenswrapper[4797]: I1013 13:25:10.890759 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4dc6c99-zvnw5"] Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.042697 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.042829 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.042853 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-config\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.042923 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5z8f\" (UniqueName: 
\"kubernetes.io/projected/19d55089-2ba6-420b-8d7c-d4f96fe2434d-kube-api-access-f5z8f\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.042949 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-svc\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.042991 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.144259 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-config\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.144302 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.144356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5z8f\" (UniqueName: 
\"kubernetes.io/projected/19d55089-2ba6-420b-8d7c-d4f96fe2434d-kube-api-access-f5z8f\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.144381 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-svc\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.144398 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.144433 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.145416 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.145415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-svc\") pod 
\"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.145467 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.145472 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-config\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.145982 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.164679 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5z8f\" (UniqueName: \"kubernetes.io/projected/19d55089-2ba6-420b-8d7c-d4f96fe2434d-kube-api-access-f5z8f\") pod \"dnsmasq-dns-6b4dc6c99-zvnw5\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.193973 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.651171 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4dc6c99-zvnw5"] Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.770681 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.957845 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qnt\" (UniqueName: \"kubernetes.io/projected/5a945414-93e9-401f-aa0e-15e040d78017-kube-api-access-85qnt\") pod \"5a945414-93e9-401f-aa0e-15e040d78017\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.957980 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-combined-ca-bundle\") pod \"5a945414-93e9-401f-aa0e-15e040d78017\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.958010 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-config-data\") pod \"5a945414-93e9-401f-aa0e-15e040d78017\" (UID: \"5a945414-93e9-401f-aa0e-15e040d78017\") " Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.963991 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a945414-93e9-401f-aa0e-15e040d78017-kube-api-access-85qnt" (OuterVolumeSpecName: "kube-api-access-85qnt") pod "5a945414-93e9-401f-aa0e-15e040d78017" (UID: "5a945414-93e9-401f-aa0e-15e040d78017"). InnerVolumeSpecName "kube-api-access-85qnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:11 crc kubenswrapper[4797]: I1013 13:25:11.981441 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a945414-93e9-401f-aa0e-15e040d78017" (UID: "5a945414-93e9-401f-aa0e-15e040d78017"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.002690 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-config-data" (OuterVolumeSpecName: "config-data") pod "5a945414-93e9-401f-aa0e-15e040d78017" (UID: "5a945414-93e9-401f-aa0e-15e040d78017"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.060186 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.060222 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a945414-93e9-401f-aa0e-15e040d78017-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.060232 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qnt\" (UniqueName: \"kubernetes.io/projected/5a945414-93e9-401f-aa0e-15e040d78017-kube-api-access-85qnt\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.521955 4797 generic.go:334] "Generic (PLEG): container finished" podID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerID="084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7" 
exitCode=0 Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.522146 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" event={"ID":"19d55089-2ba6-420b-8d7c-d4f96fe2434d","Type":"ContainerDied","Data":"084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7"} Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.522639 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" event={"ID":"19d55089-2ba6-420b-8d7c-d4f96fe2434d","Type":"ContainerStarted","Data":"d14043c623b401de0fc6dc8327e480808f3a395e6ac99bece027c7225bbaa8a0"} Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.524577 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ftfs" event={"ID":"5a945414-93e9-401f-aa0e-15e040d78017","Type":"ContainerDied","Data":"9f51698068d90c50ec68e5f1542b2dc5a2212a554176a69d1ad5bcc89c2e7e8f"} Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.524669 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f51698068d90c50ec68e5f1542b2dc5a2212a554176a69d1ad5bcc89c2e7e8f" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.524672 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8ftfs" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.739509 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4dc6c99-zvnw5"] Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.766203 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mxz7m"] Oct 13 13:25:12 crc kubenswrapper[4797]: E1013 13:25:12.766549 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a945414-93e9-401f-aa0e-15e040d78017" containerName="keystone-db-sync" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.766563 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a945414-93e9-401f-aa0e-15e040d78017" containerName="keystone-db-sync" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.766748 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a945414-93e9-401f-aa0e-15e040d78017" containerName="keystone-db-sync" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.767314 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.778300 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4snmq" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.778519 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.778624 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.779063 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.780223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-combined-ca-bundle\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.780305 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-credential-keys\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.780373 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-config-data\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.780424 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smbc\" (UniqueName: \"kubernetes.io/projected/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-kube-api-access-5smbc\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.780452 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-scripts\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.780577 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-fernet-keys\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.802687 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9564cbb5-5l5n4"] Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.834696 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.846714 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mxz7m"] Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.899533 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-credential-keys\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.899604 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-config-data\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.899642 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smbc\" (UniqueName: \"kubernetes.io/projected/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-kube-api-access-5smbc\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.900515 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-scripts\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.900997 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-fernet-keys\") pod 
\"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.901097 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-combined-ca-bundle\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.914766 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-combined-ca-bundle\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.917547 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-config-data\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.918350 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-credential-keys\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.931213 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-scripts\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc 
kubenswrapper[4797]: I1013 13:25:12.933460 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-fernet-keys\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.938912 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9564cbb5-5l5n4"] Oct 13 13:25:12 crc kubenswrapper[4797]: I1013 13:25:12.963531 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smbc\" (UniqueName: \"kubernetes.io/projected/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-kube-api-access-5smbc\") pod \"keystone-bootstrap-mxz7m\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.002707 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-svc\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.002763 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ns8\" (UniqueName: \"kubernetes.io/projected/5ae891fa-6596-4556-96c5-52e4118d7fc1-kube-api-access-g2ns8\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.002806 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-swift-storage-0\") 
pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.002916 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.002937 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-config\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.002955 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.086153 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.088193 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.092122 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.107015 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ns8\" (UniqueName: \"kubernetes.io/projected/5ae891fa-6596-4556-96c5-52e4118d7fc1-kube-api-access-g2ns8\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.107085 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.107143 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.107177 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-config\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.107206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.107327 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-svc\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.108432 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.108477 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-svc\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.108490 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.108679 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.109355 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.119637 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-config\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.130883 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.138767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ns8\" (UniqueName: \"kubernetes.io/projected/5ae891fa-6596-4556-96c5-52e4118d7fc1-kube-api-access-g2ns8\") pod \"dnsmasq-dns-6b9564cbb5-5l5n4\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") " pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.146261 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.188913 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-5dzch"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.189949 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.201228 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.210147 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.210421 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.210533 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m9gb6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212487 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-run-httpd\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212518 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-scripts\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212555 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212622 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212641 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-config-data\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212666 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-log-httpd\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.212682 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp9df\" (UniqueName: \"kubernetes.io/projected/f52aa623-f23f-4f02-b744-7c5b1e066e50-kube-api-access-fp9df\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.232910 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5dzch"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.275143 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9564cbb5-5l5n4"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.306231 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f7885f7f-jk7j6"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.307450 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315582 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-log-httpd\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315628 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp9df\" (UniqueName: \"kubernetes.io/projected/f52aa623-f23f-4f02-b744-7c5b1e066e50-kube-api-access-fp9df\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315669 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z999w\" (UniqueName: \"kubernetes.io/projected/974e0c08-3519-4be7-a9d1-c7db6016ad6f-kube-api-access-z999w\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-run-httpd\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315752 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-scripts\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315809 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315849 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-scripts\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315876 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e0c08-3519-4be7-a9d1-c7db6016ad6f-logs\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315899 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-config-data\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315954 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-combined-ca-bundle\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.315997 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.316028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-config-data\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.316368 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-run-httpd\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.316578 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-log-httpd\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.333502 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f7885f7f-jk7j6"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.352669 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-scripts\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.352685 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.362468 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.363294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp9df\" (UniqueName: \"kubernetes.io/projected/f52aa623-f23f-4f02-b744-7c5b1e066e50-kube-api-access-fp9df\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.365204 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-config-data\") pod \"ceilometer-0\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417063 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptppm\" (UniqueName: \"kubernetes.io/projected/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-kube-api-access-ptppm\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417118 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-scripts\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417140 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-swift-storage-0\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417161 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e0c08-3519-4be7-a9d1-c7db6016ad6f-logs\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417179 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-config-data\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-nb\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417239 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-combined-ca-bundle\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417281 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-sb\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417310 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z999w\" (UniqueName: \"kubernetes.io/projected/974e0c08-3519-4be7-a9d1-c7db6016ad6f-kube-api-access-z999w\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417327 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-svc\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.417988 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-config\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.418282 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.420104 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e0c08-3519-4be7-a9d1-c7db6016ad6f-logs\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.427308 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-config-data\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.427948 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-scripts\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.428352 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-combined-ca-bundle\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.450442 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z999w\" (UniqueName: \"kubernetes.io/projected/974e0c08-3519-4be7-a9d1-c7db6016ad6f-kube-api-access-z999w\") pod \"placement-db-sync-5dzch\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.460661 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.519845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptppm\" (UniqueName: \"kubernetes.io/projected/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-kube-api-access-ptppm\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.520136 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-swift-storage-0\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.520174 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-nb\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.520251 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-sb\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.520293 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-svc\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " 
pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.520316 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-config\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.522342 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-config\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.522394 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-sb\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.522503 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-nb\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.525727 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-svc\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.526168 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-swift-storage-0\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.540029 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptppm\" (UniqueName: \"kubernetes.io/projected/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-kube-api-access-ptppm\") pod \"dnsmasq-dns-77f7885f7f-jk7j6\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.541151 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" event={"ID":"19d55089-2ba6-420b-8d7c-d4f96fe2434d","Type":"ContainerStarted","Data":"26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285"} Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.541315 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerName="dnsmasq-dns" containerID="cri-o://26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285" gracePeriod=10 Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.541421 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.565506 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" podStartSLOduration=3.565485985 podStartE2EDuration="3.565485985s" podCreationTimestamp="2025-10-13 13:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-13 13:25:13.559462987 +0000 UTC m=+1091.093013253" watchObservedRunningTime="2025-10-13 13:25:13.565485985 +0000 UTC m=+1091.099036231" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.742898 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b951-account-create-slxdp"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.744015 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b951-account-create-slxdp" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.755306 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.773655 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b951-account-create-slxdp"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.779314 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:25:13 crc kubenswrapper[4797]: W1013 13:25:13.780376 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode951a4d9_1a79_4872_b889_1dfdf3b8fa18.slice/crio-e3d999e4dff9ff330764e26ab23613c906dce36b31600f29633e73a54589e43f WatchSource:0}: Error finding container e3d999e4dff9ff330764e26ab23613c906dce36b31600f29633e73a54589e43f: Status 404 returned error can't find the container with id e3d999e4dff9ff330764e26ab23613c906dce36b31600f29633e73a54589e43f Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.790683 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mxz7m"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.828401 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xplt\" (UniqueName: 
\"kubernetes.io/projected/3833a450-53fb-44f6-974d-b2496e3a98d8-kube-api-access-6xplt\") pod \"barbican-b951-account-create-slxdp\" (UID: \"3833a450-53fb-44f6-974d-b2496e3a98d8\") " pod="openstack/barbican-b951-account-create-slxdp" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.844265 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-faff-account-create-wnr29"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.845527 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-faff-account-create-wnr29" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.848259 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.862364 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-faff-account-create-wnr29"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.893502 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9564cbb5-5l5n4"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.928932 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.930723 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.931653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xplt\" (UniqueName: \"kubernetes.io/projected/3833a450-53fb-44f6-974d-b2496e3a98d8-kube-api-access-6xplt\") pod \"barbican-b951-account-create-slxdp\" (UID: \"3833a450-53fb-44f6-974d-b2496e3a98d8\") " pod="openstack/barbican-b951-account-create-slxdp" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.931746 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44t2\" (UniqueName: \"kubernetes.io/projected/3744809d-6956-4dfa-bede-4679ab2d9296-kube-api-access-n44t2\") pod \"cinder-faff-account-create-wnr29\" (UID: \"3744809d-6956-4dfa-bede-4679ab2d9296\") " pod="openstack/cinder-faff-account-create-wnr29" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.935132 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c7pxm" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.935460 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.935954 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.936013 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 13:25:13 crc kubenswrapper[4797]: I1013 13:25:13.952486 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xplt\" (UniqueName: \"kubernetes.io/projected/3833a450-53fb-44f6-974d-b2496e3a98d8-kube-api-access-6xplt\") pod \"barbican-b951-account-create-slxdp\" (UID: \"3833a450-53fb-44f6-974d-b2496e3a98d8\") " 
pod="openstack/barbican-b951-account-create-slxdp" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.005919 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.023050 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.025761 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.031386 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.035636 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtsjv\" (UniqueName: \"kubernetes.io/projected/775308c0-af24-4959-a964-b0d3de6ab0fd-kube-api-access-jtsjv\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.035703 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.035726 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc 
kubenswrapper[4797]: I1013 13:25:14.035765 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.035802 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.035851 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-logs\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.035915 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n44t2\" (UniqueName: \"kubernetes.io/projected/3744809d-6956-4dfa-bede-4679ab2d9296-kube-api-access-n44t2\") pod \"cinder-faff-account-create-wnr29\" (UID: \"3744809d-6956-4dfa-bede-4679ab2d9296\") " pod="openstack/cinder-faff-account-create-wnr29" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.036005 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc 
kubenswrapper[4797]: I1013 13:25:14.047666 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.053401 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5dzch"] Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.067005 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44t2\" (UniqueName: \"kubernetes.io/projected/3744809d-6956-4dfa-bede-4679ab2d9296-kube-api-access-n44t2\") pod \"cinder-faff-account-create-wnr29\" (UID: \"3744809d-6956-4dfa-bede-4679ab2d9296\") " pod="openstack/cinder-faff-account-create-wnr29" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.079515 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.088777 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b951-account-create-slxdp" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.135672 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f077-account-create-c5h7s"] Oct 13 13:25:14 crc kubenswrapper[4797]: E1013 13:25:14.136082 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerName="dnsmasq-dns" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.136095 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerName="dnsmasq-dns" Oct 13 13:25:14 crc kubenswrapper[4797]: E1013 13:25:14.136125 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerName="init" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.136131 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerName="init" 
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.136279 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerName="dnsmasq-dns" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.137834 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-swift-storage-0\") pod \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.137983 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5z8f\" (UniqueName: \"kubernetes.io/projected/19d55089-2ba6-420b-8d7c-d4f96fe2434d-kube-api-access-f5z8f\") pod \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138023 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-config\") pod \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138077 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-sb\") pod \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138174 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-svc\") pod \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " Oct 13 13:25:14 crc 
kubenswrapper[4797]: I1013 13:25:14.138193 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-nb\") pod \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\" (UID: \"19d55089-2ba6-420b-8d7c-d4f96fe2434d\") " Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9kpb\" (UniqueName: \"kubernetes.io/projected/41cd509e-46e1-4776-9179-b00dec55dd2a-kube-api-access-f9kpb\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138510 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138527 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtsjv\" (UniqueName: \"kubernetes.io/projected/775308c0-af24-4959-a964-b0d3de6ab0fd-kube-api-access-jtsjv\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 
13:25:14.138565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138582 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138599 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-logs\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138714 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138729 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-logs\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138800 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.138845 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.139403 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.148310 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.148345 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-logs\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.153685 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d55089-2ba6-420b-8d7c-d4f96fe2434d-kube-api-access-f5z8f" (OuterVolumeSpecName: "kube-api-access-f5z8f") pod "19d55089-2ba6-420b-8d7c-d4f96fe2434d" (UID: "19d55089-2ba6-420b-8d7c-d4f96fe2434d"). InnerVolumeSpecName "kube-api-access-f5z8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.154070 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f077-account-create-c5h7s" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.155813 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.168758 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtsjv\" (UniqueName: \"kubernetes.io/projected/775308c0-af24-4959-a964-b0d3de6ab0fd-kube-api-access-jtsjv\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.170589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.175891 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.177048 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.187204 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-faff-account-create-wnr29" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.198251 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f077-account-create-c5h7s"] Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.239881 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-logs\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.239928 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.239971 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240004 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240036 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbxdd\" (UniqueName: 
\"kubernetes.io/projected/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b-kube-api-access-kbxdd\") pod \"neutron-f077-account-create-c5h7s\" (UID: \"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b\") " pod="openstack/neutron-f077-account-create-c5h7s" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240068 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9kpb\" (UniqueName: \"kubernetes.io/projected/41cd509e-46e1-4776-9179-b00dec55dd2a-kube-api-access-f9kpb\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240093 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240120 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240180 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5z8f\" (UniqueName: \"kubernetes.io/projected/19d55089-2ba6-420b-8d7c-d4f96fe2434d-kube-api-access-f5z8f\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.240690 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"41cd509e-46e1-4776-9179-b00dec55dd2a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.241264 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.241481 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-logs\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.267413 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9kpb\" (UniqueName: \"kubernetes.io/projected/41cd509e-46e1-4776-9179-b00dec55dd2a-kube-api-access-f9kpb\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.276002 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.278530 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " 
pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.278920 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.343134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbxdd\" (UniqueName: \"kubernetes.io/projected/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b-kube-api-access-kbxdd\") pod \"neutron-f077-account-create-c5h7s\" (UID: \"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b\") " pod="openstack/neutron-f077-account-create-c5h7s" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.365049 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.378735 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f7885f7f-jk7j6"] Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.379741 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbxdd\" (UniqueName: \"kubernetes.io/projected/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b-kube-api-access-kbxdd\") pod \"neutron-f077-account-create-c5h7s\" (UID: \"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b\") " pod="openstack/neutron-f077-account-create-c5h7s" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.442421 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-nb" 
(OuterVolumeSpecName: "ovsdbserver-nb") pod "19d55089-2ba6-420b-8d7c-d4f96fe2434d" (UID: "19d55089-2ba6-420b-8d7c-d4f96fe2434d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.459364 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.508021 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.519946 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "19d55089-2ba6-420b-8d7c-d4f96fe2434d" (UID: "19d55089-2ba6-420b-8d7c-d4f96fe2434d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.560866 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19d55089-2ba6-420b-8d7c-d4f96fe2434d" (UID: "19d55089-2ba6-420b-8d7c-d4f96fe2434d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.562461 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.562555 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.565896 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.576510 4797 generic.go:334] "Generic (PLEG): container finished" podID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" containerID="26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285" exitCode=0 Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.576618 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" event={"ID":"19d55089-2ba6-420b-8d7c-d4f96fe2434d","Type":"ContainerDied","Data":"26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.576657 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" event={"ID":"19d55089-2ba6-420b-8d7c-d4f96fe2434d","Type":"ContainerDied","Data":"d14043c623b401de0fc6dc8327e480808f3a395e6ac99bece027c7225bbaa8a0"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.576682 4797 scope.go:117] "RemoveContainer" containerID="26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.576751 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4dc6c99-zvnw5" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.590971 4797 generic.go:334] "Generic (PLEG): container finished" podID="5ae891fa-6596-4556-96c5-52e4118d7fc1" containerID="de19660da247d440f3212d062a389530dba92083100020a19171b495b07dff1a" exitCode=0 Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.591044 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" event={"ID":"5ae891fa-6596-4556-96c5-52e4118d7fc1","Type":"ContainerDied","Data":"de19660da247d440f3212d062a389530dba92083100020a19171b495b07dff1a"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.591076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" event={"ID":"5ae891fa-6596-4556-96c5-52e4118d7fc1","Type":"ContainerStarted","Data":"6732e4706ccaea04fde963372814211c83a31f11bb00de17229f28d01eca2042"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.606895 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mxz7m" event={"ID":"e951a4d9-1a79-4872-b889-1dfdf3b8fa18","Type":"ContainerStarted","Data":"eafea942da1ddb6d4abaaa93876373ee484606f7001d932ec2587dd831ade1f5"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.607119 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mxz7m" event={"ID":"e951a4d9-1a79-4872-b889-1dfdf3b8fa18","Type":"ContainerStarted","Data":"e3d999e4dff9ff330764e26ab23613c906dce36b31600f29633e73a54589e43f"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.609108 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" event={"ID":"71552bbd-b6bc-43b0-95ba-0c3dc0d93468","Type":"ContainerStarted","Data":"4adfe03f1df3525f379263654c5803c20a021fde3d136a13d3b414f3a9cbab71"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.610067 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/placement-db-sync-5dzch" event={"ID":"974e0c08-3519-4be7-a9d1-c7db6016ad6f","Type":"ContainerStarted","Data":"70e4b839298a0f2a732d1fe1873068e787271458d69772f97be84243440b17e9"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.611403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerStarted","Data":"5929e5b3693de4a2e6e846dbca7316d5946ffebd561262e8f926584f4c7d1670"} Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.622323 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19d55089-2ba6-420b-8d7c-d4f96fe2434d" (UID: "19d55089-2ba6-420b-8d7c-d4f96fe2434d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.624066 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-config" (OuterVolumeSpecName: "config") pod "19d55089-2ba6-420b-8d7c-d4f96fe2434d" (UID: "19d55089-2ba6-420b-8d7c-d4f96fe2434d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.646119 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mxz7m" podStartSLOduration=2.646094788 podStartE2EDuration="2.646094788s" podCreationTimestamp="2025-10-13 13:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:14.643679848 +0000 UTC m=+1092.177230104" watchObservedRunningTime="2025-10-13 13:25:14.646094788 +0000 UTC m=+1092.179645044"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.649102 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.664245 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-config\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.664294 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d55089-2ba6-420b-8d7c-d4f96fe2434d-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.714123 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f077-account-create-c5h7s"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.767952 4797 scope.go:117] "RemoveContainer" containerID="084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.783651 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b951-account-create-slxdp"]
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.816664 4797 scope.go:117] "RemoveContainer" containerID="26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285"
Oct 13 13:25:14 crc kubenswrapper[4797]: E1013 13:25:14.820882 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285\": container with ID starting with 26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285 not found: ID does not exist" containerID="26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.820927 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285"} err="failed to get container status \"26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285\": rpc error: code = NotFound desc = could not find container \"26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285\": container with ID starting with 26e9da622a6e38bfe4dd1c0cc3088ccd5ab9b29fa4aceeb79ee76e05d791c285 not found: ID does not exist"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.820959 4797 scope.go:117] "RemoveContainer" containerID="084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7"
Oct 13 13:25:14 crc kubenswrapper[4797]: E1013 13:25:14.824519 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7\": container with ID starting with 084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7 not found: ID does not exist" containerID="084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.824564 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7"} err="failed to get container status \"084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7\": rpc error: code = NotFound desc = could not find container \"084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7\": container with ID starting with 084c6ff31def85a70029a97db084833ace7775183cdf9693bf21112dcd3332a7 not found: ID does not exist"
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.950404 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-faff-account-create-wnr29"]
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.958669 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4dc6c99-zvnw5"]
Oct 13 13:25:14 crc kubenswrapper[4797]: I1013 13:25:14.964576 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b4dc6c99-zvnw5"]
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.247590 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d55089-2ba6-420b-8d7c-d4f96fe2434d" path="/var/lib/kubelet/pods/19d55089-2ba6-420b-8d7c-d4f96fe2434d/volumes"
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.287969 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.368889 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4"
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.430001 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f077-account-create-c5h7s"]
Oct 13 13:25:15 crc kubenswrapper[4797]: W1013 13:25:15.454204 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ac30bd_fd2e_4d0e_99b6_06b9adc7981b.slice/crio-a2f8177df4af8fec872f9de581d7e790cc551ca4e769c0794e4997c6acdee57f WatchSource:0}: Error finding container a2f8177df4af8fec872f9de581d7e790cc551ca4e769c0794e4997c6acdee57f: Status 404 returned error can't find the container with id a2f8177df4af8fec872f9de581d7e790cc551ca4e769c0794e4997c6acdee57f
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.497010 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-config\") pod \"5ae891fa-6596-4556-96c5-52e4118d7fc1\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") "
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.497085 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-swift-storage-0\") pod \"5ae891fa-6596-4556-96c5-52e4118d7fc1\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") "
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.497134 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-svc\") pod \"5ae891fa-6596-4556-96c5-52e4118d7fc1\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") "
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.497196 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-nb\") pod \"5ae891fa-6596-4556-96c5-52e4118d7fc1\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") "
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.497262 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2ns8\" (UniqueName: \"kubernetes.io/projected/5ae891fa-6596-4556-96c5-52e4118d7fc1-kube-api-access-g2ns8\") pod \"5ae891fa-6596-4556-96c5-52e4118d7fc1\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") "
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.497315 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-sb\") pod \"5ae891fa-6596-4556-96c5-52e4118d7fc1\" (UID: \"5ae891fa-6596-4556-96c5-52e4118d7fc1\") "
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.504127 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae891fa-6596-4556-96c5-52e4118d7fc1-kube-api-access-g2ns8" (OuterVolumeSpecName: "kube-api-access-g2ns8") pod "5ae891fa-6596-4556-96c5-52e4118d7fc1" (UID: "5ae891fa-6596-4556-96c5-52e4118d7fc1"). InnerVolumeSpecName "kube-api-access-g2ns8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.519116 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ae891fa-6596-4556-96c5-52e4118d7fc1" (UID: "5ae891fa-6596-4556-96c5-52e4118d7fc1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.530570 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ae891fa-6596-4556-96c5-52e4118d7fc1" (UID: "5ae891fa-6596-4556-96c5-52e4118d7fc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.530752 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ae891fa-6596-4556-96c5-52e4118d7fc1" (UID: "5ae891fa-6596-4556-96c5-52e4118d7fc1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.539672 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-config" (OuterVolumeSpecName: "config") pod "5ae891fa-6596-4556-96c5-52e4118d7fc1" (UID: "5ae891fa-6596-4556-96c5-52e4118d7fc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.546616 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ae891fa-6596-4556-96c5-52e4118d7fc1" (UID: "5ae891fa-6596-4556-96c5-52e4118d7fc1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.600268 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.600301 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-config\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.600310 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.600320 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.600328 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae891fa-6596-4556-96c5-52e4118d7fc1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.603914 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2ns8\" (UniqueName: \"kubernetes.io/projected/5ae891fa-6596-4556-96c5-52e4118d7fc1-kube-api-access-g2ns8\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.632886 4797 generic.go:334] "Generic (PLEG): container finished" podID="3833a450-53fb-44f6-974d-b2496e3a98d8" containerID="b59989a84aa6cae92a4cd6b910046455f53509fe73ce82a007f4aab8aa821634" exitCode=0
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.632978 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b951-account-create-slxdp" event={"ID":"3833a450-53fb-44f6-974d-b2496e3a98d8","Type":"ContainerDied","Data":"b59989a84aa6cae92a4cd6b910046455f53509fe73ce82a007f4aab8aa821634"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.633004 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b951-account-create-slxdp" event={"ID":"3833a450-53fb-44f6-974d-b2496e3a98d8","Type":"ContainerStarted","Data":"cccdb156cf9a74e4660a924bd98658b52b6b1c88a9cc176784d8a0cf20d75f89"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.651703 4797 generic.go:334] "Generic (PLEG): container finished" podID="3744809d-6956-4dfa-bede-4679ab2d9296" containerID="500b5a1ee7aa9cf5b1f05189445b52d0c05810f3b993e86be05f0b07214c543c" exitCode=0
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.651799 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-faff-account-create-wnr29" event={"ID":"3744809d-6956-4dfa-bede-4679ab2d9296","Type":"ContainerDied","Data":"500b5a1ee7aa9cf5b1f05189445b52d0c05810f3b993e86be05f0b07214c543c"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.651844 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-faff-account-create-wnr29" event={"ID":"3744809d-6956-4dfa-bede-4679ab2d9296","Type":"ContainerStarted","Data":"2507ce921801f73d0861601fcc73e5e4199d27655c04d6d43d6b025847d7af0d"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.660507 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4" event={"ID":"5ae891fa-6596-4556-96c5-52e4118d7fc1","Type":"ContainerDied","Data":"6732e4706ccaea04fde963372814211c83a31f11bb00de17229f28d01eca2042"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.660560 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9564cbb5-5l5n4"
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.660576 4797 scope.go:117] "RemoveContainer" containerID="de19660da247d440f3212d062a389530dba92083100020a19171b495b07dff1a"
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.670601 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"775308c0-af24-4959-a964-b0d3de6ab0fd","Type":"ContainerStarted","Data":"1c25c83e23ae1601152dc68df42997a3e106e88157d444f868a8b59cabec2592"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.675307 4797 generic.go:334] "Generic (PLEG): container finished" podID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerID="684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91" exitCode=0
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.675358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" event={"ID":"71552bbd-b6bc-43b0-95ba-0c3dc0d93468","Type":"ContainerDied","Data":"684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.681742 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f077-account-create-c5h7s" event={"ID":"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b","Type":"ContainerStarted","Data":"a2f8177df4af8fec872f9de581d7e790cc551ca4e769c0794e4997c6acdee57f"}
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.749152 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9564cbb5-5l5n4"]
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.755250 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f077-account-create-c5h7s" podStartSLOduration=1.75523238 podStartE2EDuration="1.75523238s" podCreationTimestamp="2025-10-13 13:25:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:15.74421908 +0000 UTC m=+1093.277769356" watchObservedRunningTime="2025-10-13 13:25:15.75523238 +0000 UTC m=+1093.288782646"
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.755442 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9564cbb5-5l5n4"]
Oct 13 13:25:15 crc kubenswrapper[4797]: I1013 13:25:15.898750 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.699451 4797 generic.go:334] "Generic (PLEG): container finished" podID="a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b" containerID="e4cf8b303fc9930630ea8cf2fafc4b1ecdee57504baf50f2bfe1bf0f63ecfaf1" exitCode=0
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.700052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f077-account-create-c5h7s" event={"ID":"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b","Type":"ContainerDied","Data":"e4cf8b303fc9930630ea8cf2fafc4b1ecdee57504baf50f2bfe1bf0f63ecfaf1"}
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.719684 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.728697 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"775308c0-af24-4959-a964-b0d3de6ab0fd","Type":"ContainerStarted","Data":"483b89678ad8a6da6cc5e6b483c0f8234a8b6514d49c6d5ef739f1bd8f9adf17"}
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.738768 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" event={"ID":"71552bbd-b6bc-43b0-95ba-0c3dc0d93468","Type":"ContainerStarted","Data":"90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff"}
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.738852 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6"
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.747704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41cd509e-46e1-4776-9179-b00dec55dd2a","Type":"ContainerStarted","Data":"9abbcfbad2eafde13a6e28147c07f4d9a3d502cdf539e6ac66cde4517fe86e5b"}
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.747757 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41cd509e-46e1-4776-9179-b00dec55dd2a","Type":"ContainerStarted","Data":"1f5749ab58cc7bd4264c1de7877526009f121ddcb8ee86584a055cd493eb5381"}
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.780375 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.787483 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" podStartSLOduration=3.787463725 podStartE2EDuration="3.787463725s" podCreationTimestamp="2025-10-13 13:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:16.766525941 +0000 UTC m=+1094.300076217" watchObservedRunningTime="2025-10-13 13:25:16.787463725 +0000 UTC m=+1094.321013991"
Oct 13 13:25:16 crc kubenswrapper[4797]: I1013 13:25:16.805248 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.246726 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae891fa-6596-4556-96c5-52e4118d7fc1" path="/var/lib/kubelet/pods/5ae891fa-6596-4556-96c5-52e4118d7fc1/volumes"
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.755326 4797 generic.go:334] "Generic (PLEG): container finished" podID="e951a4d9-1a79-4872-b889-1dfdf3b8fa18" containerID="eafea942da1ddb6d4abaaa93876373ee484606f7001d932ec2587dd831ade1f5" exitCode=0
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.755392 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mxz7m" event={"ID":"e951a4d9-1a79-4872-b889-1dfdf3b8fa18","Type":"ContainerDied","Data":"eafea942da1ddb6d4abaaa93876373ee484606f7001d932ec2587dd831ade1f5"}
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.757482 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"775308c0-af24-4959-a964-b0d3de6ab0fd","Type":"ContainerStarted","Data":"1e412b3c9aa4e09b4b752b697aa2a6071c6cdbba82da1496a2b0c9896bb6a012"}
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.757582 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-log" containerID="cri-o://483b89678ad8a6da6cc5e6b483c0f8234a8b6514d49c6d5ef739f1bd8f9adf17" gracePeriod=30
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.757779 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-httpd" containerID="cri-o://1e412b3c9aa4e09b4b752b697aa2a6071c6cdbba82da1496a2b0c9896bb6a012" gracePeriod=30
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.760043 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41cd509e-46e1-4776-9179-b00dec55dd2a","Type":"ContainerStarted","Data":"c4f716e090dd6db2f624e901efba209b216bca39002fc8909c00edabc0a0bf68"}
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.760097 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-log" containerID="cri-o://9abbcfbad2eafde13a6e28147c07f4d9a3d502cdf539e6ac66cde4517fe86e5b" gracePeriod=30
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.760171 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-httpd" containerID="cri-o://c4f716e090dd6db2f624e901efba209b216bca39002fc8909c00edabc0a0bf68" gracePeriod=30
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.795461 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.795436685 podStartE2EDuration="5.795436685s" podCreationTimestamp="2025-10-13 13:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:17.792587475 +0000 UTC m=+1095.326137751" watchObservedRunningTime="2025-10-13 13:25:17.795436685 +0000 UTC m=+1095.328986961"
Oct 13 13:25:17 crc kubenswrapper[4797]: I1013 13:25:17.818229 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.818209754 podStartE2EDuration="5.818209754s" podCreationTimestamp="2025-10-13 13:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:17.816821039 +0000 UTC m=+1095.350371305" watchObservedRunningTime="2025-10-13 13:25:17.818209754 +0000 UTC m=+1095.351760010"
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.120178 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.120245 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.768977 4797 generic.go:334] "Generic (PLEG): container finished" podID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerID="c4f716e090dd6db2f624e901efba209b216bca39002fc8909c00edabc0a0bf68" exitCode=0
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.769015 4797 generic.go:334] "Generic (PLEG): container finished" podID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerID="9abbcfbad2eafde13a6e28147c07f4d9a3d502cdf539e6ac66cde4517fe86e5b" exitCode=143
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.769039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41cd509e-46e1-4776-9179-b00dec55dd2a","Type":"ContainerDied","Data":"c4f716e090dd6db2f624e901efba209b216bca39002fc8909c00edabc0a0bf68"}
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.769073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41cd509e-46e1-4776-9179-b00dec55dd2a","Type":"ContainerDied","Data":"9abbcfbad2eafde13a6e28147c07f4d9a3d502cdf539e6ac66cde4517fe86e5b"}
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.770773 4797 generic.go:334] "Generic (PLEG): container finished" podID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerID="1e412b3c9aa4e09b4b752b697aa2a6071c6cdbba82da1496a2b0c9896bb6a012" exitCode=0
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.770805 4797 generic.go:334] "Generic (PLEG): container finished" podID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerID="483b89678ad8a6da6cc5e6b483c0f8234a8b6514d49c6d5ef739f1bd8f9adf17" exitCode=143
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.770848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"775308c0-af24-4959-a964-b0d3de6ab0fd","Type":"ContainerDied","Data":"1e412b3c9aa4e09b4b752b697aa2a6071c6cdbba82da1496a2b0c9896bb6a012"}
Oct 13 13:25:18 crc kubenswrapper[4797]: I1013 13:25:18.770875 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"775308c0-af24-4959-a964-b0d3de6ab0fd","Type":"ContainerDied","Data":"483b89678ad8a6da6cc5e6b483c0f8234a8b6514d49c6d5ef739f1bd8f9adf17"}
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.371778 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f077-account-create-c5h7s"
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.473116 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbxdd\" (UniqueName: \"kubernetes.io/projected/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b-kube-api-access-kbxdd\") pod \"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b\" (UID: \"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b\") "
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.479284 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b-kube-api-access-kbxdd" (OuterVolumeSpecName: "kube-api-access-kbxdd") pod "a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b" (UID: "a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b"). InnerVolumeSpecName "kube-api-access-kbxdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.576764 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbxdd\" (UniqueName: \"kubernetes.io/projected/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b-kube-api-access-kbxdd\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.807579 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f077-account-create-c5h7s" event={"ID":"a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b","Type":"ContainerDied","Data":"a2f8177df4af8fec872f9de581d7e790cc551ca4e769c0794e4997c6acdee57f"}
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.807903 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f8177df4af8fec872f9de581d7e790cc551ca4e769c0794e4997c6acdee57f"
Oct 13 13:25:19 crc kubenswrapper[4797]: I1013 13:25:19.807851 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f077-account-create-c5h7s"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.712705 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.746334 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b951-account-create-slxdp"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.755579 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-faff-account-create-wnr29"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.780353 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mxz7m"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.796206 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825302 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-scripts\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-httpd-run\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825443 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9kpb\" (UniqueName: \"kubernetes.io/projected/41cd509e-46e1-4776-9179-b00dec55dd2a-kube-api-access-f9kpb\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825480 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xplt\" (UniqueName: \"kubernetes.io/projected/3833a450-53fb-44f6-974d-b2496e3a98d8-kube-api-access-6xplt\") pod \"3833a450-53fb-44f6-974d-b2496e3a98d8\" (UID: \"3833a450-53fb-44f6-974d-b2496e3a98d8\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825499 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825557 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-config-data\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825598 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-combined-ca-bundle\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825619 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-logs\") pod \"41cd509e-46e1-4776-9179-b00dec55dd2a\" (UID: \"41cd509e-46e1-4776-9179-b00dec55dd2a\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.825636 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n44t2\" (UniqueName: \"kubernetes.io/projected/3744809d-6956-4dfa-bede-4679ab2d9296-kube-api-access-n44t2\") pod \"3744809d-6956-4dfa-bede-4679ab2d9296\" (UID: \"3744809d-6956-4dfa-bede-4679ab2d9296\") "
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.827784 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-logs" (OuterVolumeSpecName: "logs") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.828109 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.831258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41cd509e-46e1-4776-9179-b00dec55dd2a-kube-api-access-f9kpb" (OuterVolumeSpecName: "kube-api-access-f9kpb") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "kube-api-access-f9kpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.831693 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3744809d-6956-4dfa-bede-4679ab2d9296-kube-api-access-n44t2" (OuterVolumeSpecName: "kube-api-access-n44t2") pod "3744809d-6956-4dfa-bede-4679ab2d9296" (UID: "3744809d-6956-4dfa-bede-4679ab2d9296"). InnerVolumeSpecName "kube-api-access-n44t2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.835966 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.836027 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-scripts" (OuterVolumeSpecName: "scripts") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.839502 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.840042 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"775308c0-af24-4959-a964-b0d3de6ab0fd","Type":"ContainerDied","Data":"1c25c83e23ae1601152dc68df42997a3e106e88157d444f868a8b59cabec2592"}
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.840133 4797 scope.go:117] "RemoveContainer" containerID="1e412b3c9aa4e09b4b752b697aa2a6071c6cdbba82da1496a2b0c9896bb6a012"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.840290 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3833a450-53fb-44f6-974d-b2496e3a98d8-kube-api-access-6xplt" (OuterVolumeSpecName: "kube-api-access-6xplt") pod "3833a450-53fb-44f6-974d-b2496e3a98d8" (UID: "3833a450-53fb-44f6-974d-b2496e3a98d8"). InnerVolumeSpecName "kube-api-access-6xplt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.845121 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.845268 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41cd509e-46e1-4776-9179-b00dec55dd2a","Type":"ContainerDied","Data":"1f5749ab58cc7bd4264c1de7877526009f121ddcb8ee86584a055cd493eb5381"}
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.847019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b951-account-create-slxdp" event={"ID":"3833a450-53fb-44f6-974d-b2496e3a98d8","Type":"ContainerDied","Data":"cccdb156cf9a74e4660a924bd98658b52b6b1c88a9cc176784d8a0cf20d75f89"}
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.847138 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cccdb156cf9a74e4660a924bd98658b52b6b1c88a9cc176784d8a0cf20d75f89"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.847070 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b951-account-create-slxdp"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.855251 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-faff-account-create-wnr29" event={"ID":"3744809d-6956-4dfa-bede-4679ab2d9296","Type":"ContainerDied","Data":"2507ce921801f73d0861601fcc73e5e4199d27655c04d6d43d6b025847d7af0d"}
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.855323 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2507ce921801f73d0861601fcc73e5e4199d27655c04d6d43d6b025847d7af0d"
Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.855446 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-faff-account-create-wnr29" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.867768 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mxz7m" event={"ID":"e951a4d9-1a79-4872-b889-1dfdf3b8fa18","Type":"ContainerDied","Data":"e3d999e4dff9ff330764e26ab23613c906dce36b31600f29633e73a54589e43f"} Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.868035 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3d999e4dff9ff330764e26ab23613c906dce36b31600f29633e73a54589e43f" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.867898 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mxz7m" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.874176 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.880896 4797 scope.go:117] "RemoveContainer" containerID="483b89678ad8a6da6cc5e6b483c0f8234a8b6514d49c6d5ef739f1bd8f9adf17" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.900840 4797 scope.go:117] "RemoveContainer" containerID="c4f716e090dd6db2f624e901efba209b216bca39002fc8909c00edabc0a0bf68" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.914293 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-config-data" (OuterVolumeSpecName: "config-data") pod "41cd509e-46e1-4776-9179-b00dec55dd2a" (UID: "41cd509e-46e1-4776-9179-b00dec55dd2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.917729 4797 scope.go:117] "RemoveContainer" containerID="9abbcfbad2eafde13a6e28147c07f4d9a3d502cdf539e6ac66cde4517fe86e5b" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926584 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-combined-ca-bundle\") pod \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-scripts\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926682 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-fernet-keys\") pod \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926699 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-config-data\") pod \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926741 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-logs\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 
13:25:21.926823 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-scripts\") pod \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926859 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-config-data\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926886 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-credential-keys\") pod \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\" (UID: \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926909 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926952 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-combined-ca-bundle\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.926981 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5smbc\" (UniqueName: \"kubernetes.io/projected/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-kube-api-access-5smbc\") pod \"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\" (UID: 
\"e951a4d9-1a79-4872-b889-1dfdf3b8fa18\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927009 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-httpd-run\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927031 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtsjv\" (UniqueName: \"kubernetes.io/projected/775308c0-af24-4959-a964-b0d3de6ab0fd-kube-api-access-jtsjv\") pod \"775308c0-af24-4959-a964-b0d3de6ab0fd\" (UID: \"775308c0-af24-4959-a964-b0d3de6ab0fd\") " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927346 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927363 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927376 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9kpb\" (UniqueName: \"kubernetes.io/projected/41cd509e-46e1-4776-9179-b00dec55dd2a-kube-api-access-f9kpb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927392 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xplt\" (UniqueName: \"kubernetes.io/projected/3833a450-53fb-44f6-974d-b2496e3a98d8-kube-api-access-6xplt\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927414 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927425 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927436 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41cd509e-46e1-4776-9179-b00dec55dd2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927446 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41cd509e-46e1-4776-9179-b00dec55dd2a-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.927456 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n44t2\" (UniqueName: \"kubernetes.io/projected/3744809d-6956-4dfa-bede-4679ab2d9296-kube-api-access-n44t2\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.928423 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-logs" (OuterVolumeSpecName: "logs") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.928702 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.931871 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-kube-api-access-5smbc" (OuterVolumeSpecName: "kube-api-access-5smbc") pod "e951a4d9-1a79-4872-b889-1dfdf3b8fa18" (UID: "e951a4d9-1a79-4872-b889-1dfdf3b8fa18"). InnerVolumeSpecName "kube-api-access-5smbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.933471 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e951a4d9-1a79-4872-b889-1dfdf3b8fa18" (UID: "e951a4d9-1a79-4872-b889-1dfdf3b8fa18"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.934084 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.934146 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775308c0-af24-4959-a964-b0d3de6ab0fd-kube-api-access-jtsjv" (OuterVolumeSpecName: "kube-api-access-jtsjv") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "kube-api-access-jtsjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.934616 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-scripts" (OuterVolumeSpecName: "scripts") pod "e951a4d9-1a79-4872-b889-1dfdf3b8fa18" (UID: "e951a4d9-1a79-4872-b889-1dfdf3b8fa18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.935258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e951a4d9-1a79-4872-b889-1dfdf3b8fa18" (UID: "e951a4d9-1a79-4872-b889-1dfdf3b8fa18"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.937389 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-scripts" (OuterVolumeSpecName: "scripts") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.950714 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.951991 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-config-data" (OuterVolumeSpecName: "config-data") pod "e951a4d9-1a79-4872-b889-1dfdf3b8fa18" (UID: "e951a4d9-1a79-4872-b889-1dfdf3b8fa18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.954939 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e951a4d9-1a79-4872-b889-1dfdf3b8fa18" (UID: "e951a4d9-1a79-4872-b889-1dfdf3b8fa18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.974952 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:21 crc kubenswrapper[4797]: I1013 13:25:21.983029 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-config-data" (OuterVolumeSpecName: "config-data") pod "775308c0-af24-4959-a964-b0d3de6ab0fd" (UID: "775308c0-af24-4959-a964-b0d3de6ab0fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029547 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029582 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029593 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029607 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029642 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029655 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029665 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5smbc\" (UniqueName: \"kubernetes.io/projected/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-kube-api-access-5smbc\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029676 4797 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029686 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtsjv\" (UniqueName: \"kubernetes.io/projected/775308c0-af24-4959-a964-b0d3de6ab0fd-kube-api-access-jtsjv\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029697 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029709 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775308c0-af24-4959-a964-b0d3de6ab0fd-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029719 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029729 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951a4d9-1a79-4872-b889-1dfdf3b8fa18-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.029738 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775308c0-af24-4959-a964-b0d3de6ab0fd-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.046020 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 
13:25:22.131302 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.179534 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.194508 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206192 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206602 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3833a450-53fb-44f6-974d-b2496e3a98d8" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206626 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3833a450-53fb-44f6-974d-b2496e3a98d8" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206646 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-log" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206654 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-log" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206668 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-log" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206676 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-log" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206692 4797 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-httpd" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206699 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-httpd" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206719 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206726 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206743 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae891fa-6596-4556-96c5-52e4118d7fc1" containerName="init" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206750 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae891fa-6596-4556-96c5-52e4118d7fc1" containerName="init" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206761 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3744809d-6956-4dfa-bede-4679ab2d9296" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206769 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3744809d-6956-4dfa-bede-4679ab2d9296" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206783 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e951a4d9-1a79-4872-b889-1dfdf3b8fa18" containerName="keystone-bootstrap" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206791 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e951a4d9-1a79-4872-b889-1dfdf3b8fa18" containerName="keystone-bootstrap" Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.206823 4797 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-httpd" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.206832 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-httpd" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207052 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e951a4d9-1a79-4872-b889-1dfdf3b8fa18" containerName="keystone-bootstrap" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207070 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae891fa-6596-4556-96c5-52e4118d7fc1" containerName="init" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207084 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-log" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207098 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-log" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207106 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" containerName="glance-httpd" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207121 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" containerName="glance-httpd" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207135 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3744809d-6956-4dfa-bede-4679ab2d9296" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207149 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3833a450-53fb-44f6-974d-b2496e3a98d8" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.207159 4797 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b" containerName="mariadb-account-create" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.208276 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.210541 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.211421 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c7pxm" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.211504 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.215718 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.237083 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.262721 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.271064 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.272826 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.278220 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.307874 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.319107 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.319537 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-clppr logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="6162b080-bb13-4fb9-9911-4cb2318a0ea7" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.334736 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.334783 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.334872 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwhwc\" (UniqueName: 
\"kubernetes.io/projected/49732ccc-e2de-4754-b66b-a1a83e504bb9-kube-api-access-mwhwc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.334900 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.334947 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.334982 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335080 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335106 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335164 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335186 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335212 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clppr\" (UniqueName: \"kubernetes.io/projected/6162b080-bb13-4fb9-9911-4cb2318a0ea7-kube-api-access-clppr\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335247 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-logs\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.335271 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-logs\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.381230 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:22 crc kubenswrapper[4797]: E1013 13:25:22.382965 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-mwhwc logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="49732ccc-e2de-4754-b66b-a1a83e504bb9" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.436442 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.436507 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc 
kubenswrapper[4797]: I1013 13:25:22.436565 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.436593 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.436615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.436755 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437077 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437187 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437291 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clppr\" (UniqueName: \"kubernetes.io/projected/6162b080-bb13-4fb9-9911-4cb2318a0ea7-kube-api-access-clppr\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437385 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-logs\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437472 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-logs\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437586 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437678 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437782 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwhwc\" (UniqueName: \"kubernetes.io/projected/49732ccc-e2de-4754-b66b-a1a83e504bb9-kube-api-access-mwhwc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.437893 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.438212 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.438377 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-logs\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.438596 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.438637 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-logs\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.438874 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.441001 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.451005 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.453635 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " 
pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.456512 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.460577 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.462792 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.471704 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clppr\" (UniqueName: \"kubernetes.io/projected/6162b080-bb13-4fb9-9911-4cb2318a0ea7-kube-api-access-clppr\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.472893 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwhwc\" (UniqueName: \"kubernetes.io/projected/49732ccc-e2de-4754-b66b-a1a83e504bb9-kube-api-access-mwhwc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: 
I1013 13:25:22.508217 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.515700 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.876100 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerStarted","Data":"24839f95a992d62de78749ed42c00b47bd3b4a6e9c699eb44cdbcb07b4a559cf"} Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.877798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5dzch" event={"ID":"974e0c08-3519-4be7-a9d1-c7db6016ad6f","Type":"ContainerStarted","Data":"b5c8e1cc5e2837e1df6c74841ca13a868c1d79cca109abf5eddbf0d8bb543195"} Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.877871 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.877919 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.888566 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.905731 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-5dzch" podStartSLOduration=2.558656606 podStartE2EDuration="9.905712641s" podCreationTimestamp="2025-10-13 13:25:13 +0000 UTC" firstStartedPulling="2025-10-13 13:25:14.064296058 +0000 UTC m=+1091.597846314" lastFinishedPulling="2025-10-13 13:25:21.411352093 +0000 UTC m=+1098.944902349" observedRunningTime="2025-10-13 13:25:22.898645608 +0000 UTC m=+1100.432195864" watchObservedRunningTime="2025-10-13 13:25:22.905712641 +0000 UTC m=+1100.439262897" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.908329 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.939717 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mxz7m"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946178 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946234 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-logs\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946281 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-config-data\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: 
\"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946305 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-config-data\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946336 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-combined-ca-bundle\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946382 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-scripts\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946439 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-combined-ca-bundle\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946476 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-httpd-run\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946533 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwhwc\" (UniqueName: 
\"kubernetes.io/projected/49732ccc-e2de-4754-b66b-a1a83e504bb9-kube-api-access-mwhwc\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946564 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-scripts\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946600 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-httpd-run\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946623 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clppr\" (UniqueName: \"kubernetes.io/projected/6162b080-bb13-4fb9-9911-4cb2318a0ea7-kube-api-access-clppr\") pod \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\" (UID: \"6162b080-bb13-4fb9-9911-4cb2318a0ea7\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946716 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.946745 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-logs\") pod \"49732ccc-e2de-4754-b66b-a1a83e504bb9\" (UID: \"49732ccc-e2de-4754-b66b-a1a83e504bb9\") " Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.947348 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-mxz7m"] Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.947591 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-logs" (OuterVolumeSpecName: "logs") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.948198 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.948240 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.949723 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-logs" (OuterVolumeSpecName: "logs") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.951208 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.951828 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-config-data" (OuterVolumeSpecName: "config-data") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.952280 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-scripts" (OuterVolumeSpecName: "scripts") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.953350 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-config-data" (OuterVolumeSpecName: "config-data") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.953393 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-scripts" (OuterVolumeSpecName: "scripts") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.954194 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.955430 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6162b080-bb13-4fb9-9911-4cb2318a0ea7-kube-api-access-clppr" (OuterVolumeSpecName: "kube-api-access-clppr") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "kube-api-access-clppr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.956032 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49732ccc-e2de-4754-b66b-a1a83e504bb9-kube-api-access-mwhwc" (OuterVolumeSpecName: "kube-api-access-mwhwc") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "kube-api-access-mwhwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.956930 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49732ccc-e2de-4754-b66b-a1a83e504bb9" (UID: "49732ccc-e2de-4754-b66b-a1a83e504bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:22 crc kubenswrapper[4797]: I1013 13:25:22.958891 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6162b080-bb13-4fb9-9911-4cb2318a0ea7" (UID: "6162b080-bb13-4fb9-9911-4cb2318a0ea7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.037649 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tsc96"] Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.039456 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tsc96" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.043833 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.043860 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.043886 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.044158 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4snmq" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048572 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048599 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048615 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048625 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048645 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:23 
crc kubenswrapper[4797]: I1013 13:25:23.048656 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048664 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048674 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-scripts\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048682 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162b080-bb13-4fb9-9911-4cb2318a0ea7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048690 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49732ccc-e2de-4754-b66b-a1a83e504bb9-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048698 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwhwc\" (UniqueName: \"kubernetes.io/projected/49732ccc-e2de-4754-b66b-a1a83e504bb9-kube-api-access-mwhwc\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048706 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49732ccc-e2de-4754-b66b-a1a83e504bb9-scripts\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048714 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6162b080-bb13-4fb9-9911-4cb2318a0ea7-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.048722 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clppr\" (UniqueName: \"kubernetes.io/projected/6162b080-bb13-4fb9-9911-4cb2318a0ea7-kube-api-access-clppr\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.070578 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.075727 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tsc96"]
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.080945 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.150621 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-fernet-keys\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.151089 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-credential-keys\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.151261 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-scripts\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.151408 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-combined-ca-bundle\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.151597 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n8f5\" (UniqueName: \"kubernetes.io/projected/46f55e80-87b4-4286-afa0-ab9c7143b02f-kube-api-access-9n8f5\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.151708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-config-data\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.151875 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.152015 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.254746 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-scripts\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.254839 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-combined-ca-bundle\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.254897 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n8f5\" (UniqueName: \"kubernetes.io/projected/46f55e80-87b4-4286-afa0-ab9c7143b02f-kube-api-access-9n8f5\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.254922 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-config-data\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.255010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-fernet-keys\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.255044 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-credential-keys\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.257688 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41cd509e-46e1-4776-9179-b00dec55dd2a" path="/var/lib/kubelet/pods/41cd509e-46e1-4776-9179-b00dec55dd2a/volumes"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.259032 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="775308c0-af24-4959-a964-b0d3de6ab0fd" path="/var/lib/kubelet/pods/775308c0-af24-4959-a964-b0d3de6ab0fd/volumes"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.259324 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-scripts\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.260777 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e951a4d9-1a79-4872-b889-1dfdf3b8fa18" path="/var/lib/kubelet/pods/e951a4d9-1a79-4872-b889-1dfdf3b8fa18/volumes"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.262611 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-combined-ca-bundle\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.263012 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-fernet-keys\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.263019 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-config-data\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.263495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-credential-keys\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.277651 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n8f5\" (UniqueName: \"kubernetes.io/projected/46f55e80-87b4-4286-afa0-ab9c7143b02f-kube-api-access-9n8f5\") pod \"keystone-bootstrap-tsc96\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.362373 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tsc96"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.781957 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.838541 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tsc96"]
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.866622 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9ddcb47c-khxjk"]
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.866871 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" podUID="885a17a7-d224-4081-8568-8d362c86f321" containerName="dnsmasq-dns" containerID="cri-o://016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd" gracePeriod=10
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.924752 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsc96" event={"ID":"46f55e80-87b4-4286-afa0-ab9c7143b02f","Type":"ContainerStarted","Data":"7a5d63e49a4b4beb1a57151e937ab6386a5f7cf6cce251ff5f834873da3c9c14"}
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.928870 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerStarted","Data":"a7fbc889f92c77b5d394129b70266a52ff645e52f67dbaaef83a48efffc55a63"}
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.928911 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 13 13:25:23 crc kubenswrapper[4797]: I1013 13:25:23.928914 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.001870 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.025396 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.081730 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.091833 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.092064 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.100653 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.105876 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c7pxm"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.106383 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.106388 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.148872 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tqpk9"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.150702 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.155685 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cb64g"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.156102 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.163394 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.186682 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-db-sync-config-data\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.186821 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkvk\" (UniqueName: \"kubernetes.io/projected/bb8faf00-58df-4131-af70-117df286f396-kube-api-access-mbkvk\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.186865 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-combined-ca-bundle\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.190101 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.206858 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tqpk9"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.232586 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.234509 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.238411 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.239469 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.245845 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.262750 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xzbbx"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.264055 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.266912 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.267018 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.267184 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z87h5"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.270421 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xzbbx"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289653 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289719 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lb9b\" (UniqueName: \"kubernetes.io/projected/6252a65e-85f5-42cf-9fce-3cd585d5e834-kube-api-access-8lb9b\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289769 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkvk\" (UniqueName: \"kubernetes.io/projected/bb8faf00-58df-4131-af70-117df286f396-kube-api-access-mbkvk\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289795 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-combined-ca-bundle\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289876 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289900 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289940 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.289992 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-db-sync-config-data\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.290008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-logs\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.319196 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-combined-ca-bundle\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.321236 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkvk\" (UniqueName: \"kubernetes.io/projected/bb8faf00-58df-4131-af70-117df286f396-kube-api-access-mbkvk\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.323958 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-db-sync-config-data\") pod \"barbican-db-sync-tqpk9\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " pod="openstack/barbican-db-sync-tqpk9"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392048 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392109 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tj7\" (UniqueName: \"kubernetes.io/projected/99cf4c75-5042-4f58-945f-5461cad0fbcc-kube-api-access-74tj7\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392138 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-combined-ca-bundle\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392164 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392185 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-config-data\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392668 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392908 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-logs\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.392935 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-config-data\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393269 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-logs\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393293 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393320 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7978a82-8a2f-4a86-8598-65e7dae25b77-etc-machine-id\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393392 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393417 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lb9b\" (UniqueName: \"kubernetes.io/projected/6252a65e-85f5-42cf-9fce-3cd585d5e834-kube-api-access-8lb9b\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393924 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-scripts\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393956 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgl8\" (UniqueName: \"kubernetes.io/projected/d7978a82-8a2f-4a86-8598-65e7dae25b77-kube-api-access-cwgl8\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.393997 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.394035 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.394064 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.394088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-db-sync-config-data\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.394106 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-scripts\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.394124 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.401622 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.402572 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.403095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-logs\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.404367 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.404562 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.404612 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.406419 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.412044 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lb9b\" (UniqueName: \"kubernetes.io/projected/6252a65e-85f5-42cf-9fce-3cd585d5e834-kube-api-access-8lb9b\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.461281 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.466382 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-d764t"]
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.468299 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d764t"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.478231 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.478553 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wxbwd"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.478617 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.486657 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495246 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-db-sync-config-data\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495338 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-scripts\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495384 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495415 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tj7\" (UniqueName: \"kubernetes.io/projected/99cf4c75-5042-4f58-945f-5461cad0fbcc-kube-api-access-74tj7\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0"
Oct 13 13:25:24 crc 
kubenswrapper[4797]: I1013 13:25:24.495436 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-combined-ca-bundle\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-config-data\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495513 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-config-data\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495544 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-logs\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495567 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495564 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.497136 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d764t"] Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495637 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7978a82-8a2f-4a86-8598-65e7dae25b77-etc-machine-id\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.496084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-logs\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.496334 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.495597 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7978a82-8a2f-4a86-8598-65e7dae25b77-etc-machine-id\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.497635 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-scripts\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.497672 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgl8\" (UniqueName: \"kubernetes.io/projected/d7978a82-8a2f-4a86-8598-65e7dae25b77-kube-api-access-cwgl8\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.497758 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.503108 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-scripts\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.504117 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.504653 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.506163 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tqpk9" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.508456 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-config-data\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.506234 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-combined-ca-bundle\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.509731 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-scripts\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.512886 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-config-data\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.514659 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-74tj7\" (UniqueName: \"kubernetes.io/projected/99cf4c75-5042-4f58-945f-5461cad0fbcc-kube-api-access-74tj7\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.516930 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-db-sync-config-data\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.518505 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgl8\" (UniqueName: \"kubernetes.io/projected/d7978a82-8a2f-4a86-8598-65e7dae25b77-kube-api-access-cwgl8\") pod \"cinder-db-sync-xzbbx\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.547286 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.556015 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.572177 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.585159 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.601845 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6bg\" (UniqueName: \"kubernetes.io/projected/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-kube-api-access-pw6bg\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.601944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-config\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.601983 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-combined-ca-bundle\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.703733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9265\" (UniqueName: \"kubernetes.io/projected/885a17a7-d224-4081-8568-8d362c86f321-kube-api-access-w9265\") pod \"885a17a7-d224-4081-8568-8d362c86f321\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.704131 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-nb\") pod \"885a17a7-d224-4081-8568-8d362c86f321\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " Oct 13 13:25:24 crc 
kubenswrapper[4797]: I1013 13:25:24.704195 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-svc\") pod \"885a17a7-d224-4081-8568-8d362c86f321\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.704239 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-config\") pod \"885a17a7-d224-4081-8568-8d362c86f321\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.704264 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-sb\") pod \"885a17a7-d224-4081-8568-8d362c86f321\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.704314 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-swift-storage-0\") pod \"885a17a7-d224-4081-8568-8d362c86f321\" (UID: \"885a17a7-d224-4081-8568-8d362c86f321\") " Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.705078 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6bg\" (UniqueName: \"kubernetes.io/projected/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-kube-api-access-pw6bg\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.705265 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-config\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.705310 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-combined-ca-bundle\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.711379 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-combined-ca-bundle\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.713893 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-config\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.722879 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/885a17a7-d224-4081-8568-8d362c86f321-kube-api-access-w9265" (OuterVolumeSpecName: "kube-api-access-w9265") pod "885a17a7-d224-4081-8568-8d362c86f321" (UID: "885a17a7-d224-4081-8568-8d362c86f321"). InnerVolumeSpecName "kube-api-access-w9265". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.725671 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6bg\" (UniqueName: \"kubernetes.io/projected/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-kube-api-access-pw6bg\") pod \"neutron-db-sync-d764t\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.770468 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "885a17a7-d224-4081-8568-8d362c86f321" (UID: "885a17a7-d224-4081-8568-8d362c86f321"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.779625 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-config" (OuterVolumeSpecName: "config") pod "885a17a7-d224-4081-8568-8d362c86f321" (UID: "885a17a7-d224-4081-8568-8d362c86f321"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.779754 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "885a17a7-d224-4081-8568-8d362c86f321" (UID: "885a17a7-d224-4081-8568-8d362c86f321"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.779885 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "885a17a7-d224-4081-8568-8d362c86f321" (UID: "885a17a7-d224-4081-8568-8d362c86f321"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.807484 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.807550 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.807563 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.807607 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9265\" (UniqueName: \"kubernetes.io/projected/885a17a7-d224-4081-8568-8d362c86f321-kube-api-access-w9265\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.807620 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.820882 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "885a17a7-d224-4081-8568-8d362c86f321" (UID: "885a17a7-d224-4081-8568-8d362c86f321"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.897824 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.910674 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/885a17a7-d224-4081-8568-8d362c86f321-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.959236 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsc96" event={"ID":"46f55e80-87b4-4286-afa0-ab9c7143b02f","Type":"ContainerStarted","Data":"8a82a9e40c540c4652fc02f27eeaca8b45824717be1359cb164d63d458e5e12f"} Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.961893 4797 generic.go:334] "Generic (PLEG): container finished" podID="885a17a7-d224-4081-8568-8d362c86f321" containerID="016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd" exitCode=0 Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.961949 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" event={"ID":"885a17a7-d224-4081-8568-8d362c86f321","Type":"ContainerDied","Data":"016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd"} Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.961971 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" event={"ID":"885a17a7-d224-4081-8568-8d362c86f321","Type":"ContainerDied","Data":"4a2aa2afa41f15b05c4cfa9ec64a8b333cdbbe39238d300134fa294cde0b5f2f"} Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.961987 
4797 scope.go:117] "RemoveContainer" containerID="016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.962091 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9ddcb47c-khxjk" Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.969869 4797 generic.go:334] "Generic (PLEG): container finished" podID="974e0c08-3519-4be7-a9d1-c7db6016ad6f" containerID="b5c8e1cc5e2837e1df6c74841ca13a868c1d79cca109abf5eddbf0d8bb543195" exitCode=0 Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.969914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5dzch" event={"ID":"974e0c08-3519-4be7-a9d1-c7db6016ad6f","Type":"ContainerDied","Data":"b5c8e1cc5e2837e1df6c74841ca13a868c1d79cca109abf5eddbf0d8bb543195"} Oct 13 13:25:24 crc kubenswrapper[4797]: I1013 13:25:24.977650 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tsc96" podStartSLOduration=1.97762785 podStartE2EDuration="1.97762785s" podCreationTimestamp="2025-10-13 13:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:24.976667087 +0000 UTC m=+1102.510217343" watchObservedRunningTime="2025-10-13 13:25:24.97762785 +0000 UTC m=+1102.511178106" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.018398 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9ddcb47c-khxjk"] Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.027483 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d9ddcb47c-khxjk"] Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.054451 4797 scope.go:117] "RemoveContainer" containerID="61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.062146 
4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tqpk9"] Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.082650 4797 scope.go:117] "RemoveContainer" containerID="016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd" Oct 13 13:25:25 crc kubenswrapper[4797]: E1013 13:25:25.083450 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd\": container with ID starting with 016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd not found: ID does not exist" containerID="016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.083477 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd"} err="failed to get container status \"016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd\": rpc error: code = NotFound desc = could not find container \"016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd\": container with ID starting with 016a8ddf0a24c47c392325e00e0013bc3b8e64971b096a87817aa29af7080dcd not found: ID does not exist" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.083495 4797 scope.go:117] "RemoveContainer" containerID="61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad" Oct 13 13:25:25 crc kubenswrapper[4797]: E1013 13:25:25.084015 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad\": container with ID starting with 61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad not found: ID does not exist" containerID="61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad" Oct 13 13:25:25 crc 
kubenswrapper[4797]: I1013 13:25:25.084033 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad"} err="failed to get container status \"61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad\": rpc error: code = NotFound desc = could not find container \"61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad\": container with ID starting with 61a6ca717c36ed19a2b71ed9ef41193b6aed3a67ebfcc8e62b7497a1d4caeaad not found: ID does not exist" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.205013 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:25:25 crc kubenswrapper[4797]: W1013 13:25:25.216251 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6252a65e_85f5_42cf_9fce_3cd585d5e834.slice/crio-6a07cbd2d7f3bc47690aa8c3e9f195eb4ca38be7f9280eddd4212dbd76a1c357 WatchSource:0}: Error finding container 6a07cbd2d7f3bc47690aa8c3e9f195eb4ca38be7f9280eddd4212dbd76a1c357: Status 404 returned error can't find the container with id 6a07cbd2d7f3bc47690aa8c3e9f195eb4ca38be7f9280eddd4212dbd76a1c357 Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.268356 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49732ccc-e2de-4754-b66b-a1a83e504bb9" path="/var/lib/kubelet/pods/49732ccc-e2de-4754-b66b-a1a83e504bb9/volumes" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.269187 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6162b080-bb13-4fb9-9911-4cb2318a0ea7" path="/var/lib/kubelet/pods/6162b080-bb13-4fb9-9911-4cb2318a0ea7/volumes" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.269532 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="885a17a7-d224-4081-8568-8d362c86f321" 
path="/var/lib/kubelet/pods/885a17a7-d224-4081-8568-8d362c86f321/volumes" Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.297889 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:25:25 crc kubenswrapper[4797]: W1013 13:25:25.301492 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99cf4c75_5042_4f58_945f_5461cad0fbcc.slice/crio-a45547ea08b1ea0857b051b937df96a89df3098205ac91cc97a88802d7e32512 WatchSource:0}: Error finding container a45547ea08b1ea0857b051b937df96a89df3098205ac91cc97a88802d7e32512: Status 404 returned error can't find the container with id a45547ea08b1ea0857b051b937df96a89df3098205ac91cc97a88802d7e32512 Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.311299 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xzbbx"] Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.436998 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d764t"] Oct 13 13:25:25 crc kubenswrapper[4797]: W1013 13:25:25.440505 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5dd5a95_b091_4f4f_8b05_d49d9dc0e979.slice/crio-1c7c91e3b6f7738fca6a74da8986b26f92fb80fda1e9e8aea0f68f84b9811ef7 WatchSource:0}: Error finding container 1c7c91e3b6f7738fca6a74da8986b26f92fb80fda1e9e8aea0f68f84b9811ef7: Status 404 returned error can't find the container with id 1c7c91e3b6f7738fca6a74da8986b26f92fb80fda1e9e8aea0f68f84b9811ef7 Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.992636 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6252a65e-85f5-42cf-9fce-3cd585d5e834","Type":"ContainerStarted","Data":"b445980dba3c97e4a3a6a603ee26d9e781dbb8bb8304c528c84252ac702779dd"} Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 
13:25:25.993024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6252a65e-85f5-42cf-9fce-3cd585d5e834","Type":"ContainerStarted","Data":"6a07cbd2d7f3bc47690aa8c3e9f195eb4ca38be7f9280eddd4212dbd76a1c357"} Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.994356 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99cf4c75-5042-4f58-945f-5461cad0fbcc","Type":"ContainerStarted","Data":"a45547ea08b1ea0857b051b937df96a89df3098205ac91cc97a88802d7e32512"} Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.997278 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d764t" event={"ID":"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979","Type":"ContainerStarted","Data":"4251a02173a50fd16205c9b173f202379b93c111710f2865655e603d39f3ed2c"} Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.997312 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d764t" event={"ID":"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979","Type":"ContainerStarted","Data":"1c7c91e3b6f7738fca6a74da8986b26f92fb80fda1e9e8aea0f68f84b9811ef7"} Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.998387 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tqpk9" event={"ID":"bb8faf00-58df-4131-af70-117df286f396","Type":"ContainerStarted","Data":"74b974ec6c8f5d365f5a6d85e0623b152ef8cbaff0ccad93ae97eab1dfcd5498"} Oct 13 13:25:25 crc kubenswrapper[4797]: I1013 13:25:25.999223 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xzbbx" event={"ID":"d7978a82-8a2f-4a86-8598-65e7dae25b77","Type":"ContainerStarted","Data":"08ff721a340ac5c90445a430f3d31c4908633e240ef5e5abcd65184839f6125b"} Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.019152 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-d764t" 
podStartSLOduration=2.019132834 podStartE2EDuration="2.019132834s" podCreationTimestamp="2025-10-13 13:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:26.014231214 +0000 UTC m=+1103.547781470" watchObservedRunningTime="2025-10-13 13:25:26.019132834 +0000 UTC m=+1103.552683090" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.284374 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.348894 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-config-data\") pod \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.348954 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-scripts\") pod \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.348972 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-combined-ca-bundle\") pod \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.349010 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e0c08-3519-4be7-a9d1-c7db6016ad6f-logs\") pod \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 
13:25:26.349031 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z999w\" (UniqueName: \"kubernetes.io/projected/974e0c08-3519-4be7-a9d1-c7db6016ad6f-kube-api-access-z999w\") pod \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\" (UID: \"974e0c08-3519-4be7-a9d1-c7db6016ad6f\") " Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.354372 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974e0c08-3519-4be7-a9d1-c7db6016ad6f-logs" (OuterVolumeSpecName: "logs") pod "974e0c08-3519-4be7-a9d1-c7db6016ad6f" (UID: "974e0c08-3519-4be7-a9d1-c7db6016ad6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.354683 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974e0c08-3519-4be7-a9d1-c7db6016ad6f-kube-api-access-z999w" (OuterVolumeSpecName: "kube-api-access-z999w") pod "974e0c08-3519-4be7-a9d1-c7db6016ad6f" (UID: "974e0c08-3519-4be7-a9d1-c7db6016ad6f"). InnerVolumeSpecName "kube-api-access-z999w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.373002 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-scripts" (OuterVolumeSpecName: "scripts") pod "974e0c08-3519-4be7-a9d1-c7db6016ad6f" (UID: "974e0c08-3519-4be7-a9d1-c7db6016ad6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.392115 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-config-data" (OuterVolumeSpecName: "config-data") pod "974e0c08-3519-4be7-a9d1-c7db6016ad6f" (UID: "974e0c08-3519-4be7-a9d1-c7db6016ad6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.392205 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "974e0c08-3519-4be7-a9d1-c7db6016ad6f" (UID: "974e0c08-3519-4be7-a9d1-c7db6016ad6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.450959 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.451000 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.451009 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e0c08-3519-4be7-a9d1-c7db6016ad6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.451021 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e0c08-3519-4be7-a9d1-c7db6016ad6f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:26 crc kubenswrapper[4797]: I1013 13:25:26.451029 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z999w\" (UniqueName: \"kubernetes.io/projected/974e0c08-3519-4be7-a9d1-c7db6016ad6f-kube-api-access-z999w\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.017028 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5dzch" 
event={"ID":"974e0c08-3519-4be7-a9d1-c7db6016ad6f","Type":"ContainerDied","Data":"70e4b839298a0f2a732d1fe1873068e787271458d69772f97be84243440b17e9"} Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.017538 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e4b839298a0f2a732d1fe1873068e787271458d69772f97be84243440b17e9" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.017413 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5dzch" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.038207 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99cf4c75-5042-4f58-945f-5461cad0fbcc","Type":"ContainerStarted","Data":"e95744e7a7fb393cf4dc18192de5ca1a6dbdc1518ad298e95bb74bf374852a4b"} Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.095914 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-778fd9d9d-t868n"] Oct 13 13:25:27 crc kubenswrapper[4797]: E1013 13:25:27.096333 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885a17a7-d224-4081-8568-8d362c86f321" containerName="dnsmasq-dns" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.096355 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="885a17a7-d224-4081-8568-8d362c86f321" containerName="dnsmasq-dns" Oct 13 13:25:27 crc kubenswrapper[4797]: E1013 13:25:27.096381 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885a17a7-d224-4081-8568-8d362c86f321" containerName="init" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.096419 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="885a17a7-d224-4081-8568-8d362c86f321" containerName="init" Oct 13 13:25:27 crc kubenswrapper[4797]: E1013 13:25:27.096435 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974e0c08-3519-4be7-a9d1-c7db6016ad6f" containerName="placement-db-sync" 
Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.096444 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="974e0c08-3519-4be7-a9d1-c7db6016ad6f" containerName="placement-db-sync" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.096653 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="974e0c08-3519-4be7-a9d1-c7db6016ad6f" containerName="placement-db-sync" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.096679 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="885a17a7-d224-4081-8568-8d362c86f321" containerName="dnsmasq-dns" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.097775 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.099311 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.101302 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.101501 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.101689 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m9gb6" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.101870 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.103907 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-778fd9d9d-t868n"] Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170399 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-internal-tls-certs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170455 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e82c55-e59e-4d97-800c-66a4f9555047-logs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170583 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqm27\" (UniqueName: \"kubernetes.io/projected/d5e82c55-e59e-4d97-800c-66a4f9555047-kube-api-access-cqm27\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-scripts\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-combined-ca-bundle\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170909 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-public-tls-certs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.170936 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-config-data\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273324 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-internal-tls-certs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e82c55-e59e-4d97-800c-66a4f9555047-logs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273425 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqm27\" (UniqueName: \"kubernetes.io/projected/d5e82c55-e59e-4d97-800c-66a4f9555047-kube-api-access-cqm27\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273471 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-scripts\") pod 
\"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273500 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-combined-ca-bundle\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273561 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-public-tls-certs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.273626 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-config-data\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.274362 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e82c55-e59e-4d97-800c-66a4f9555047-logs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.281378 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-internal-tls-certs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " 
pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.281696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-config-data\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.283003 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-public-tls-certs\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.285533 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-combined-ca-bundle\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.286159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-scripts\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.290971 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqm27\" (UniqueName: \"kubernetes.io/projected/d5e82c55-e59e-4d97-800c-66a4f9555047-kube-api-access-cqm27\") pod \"placement-778fd9d9d-t868n\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.418154 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:27 crc kubenswrapper[4797]: I1013 13:25:27.943577 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-778fd9d9d-t868n"] Oct 13 13:25:28 crc kubenswrapper[4797]: I1013 13:25:28.049541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6252a65e-85f5-42cf-9fce-3cd585d5e834","Type":"ContainerStarted","Data":"70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc"} Oct 13 13:25:28 crc kubenswrapper[4797]: I1013 13:25:28.088851 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.088828814 podStartE2EDuration="4.088828814s" podCreationTimestamp="2025-10-13 13:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:28.076499792 +0000 UTC m=+1105.610050068" watchObservedRunningTime="2025-10-13 13:25:28.088828814 +0000 UTC m=+1105.622379070" Oct 13 13:25:29 crc kubenswrapper[4797]: I1013 13:25:29.063052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99cf4c75-5042-4f58-945f-5461cad0fbcc","Type":"ContainerStarted","Data":"8a2c55dde5e7cdb88e6195d0cc7cf9c60c2d76d91b4fbb30b8a991240a8dfd5a"} Oct 13 13:25:29 crc kubenswrapper[4797]: I1013 13:25:29.069236 4797 generic.go:334] "Generic (PLEG): container finished" podID="46f55e80-87b4-4286-afa0-ab9c7143b02f" containerID="8a82a9e40c540c4652fc02f27eeaca8b45824717be1359cb164d63d458e5e12f" exitCode=0 Oct 13 13:25:29 crc kubenswrapper[4797]: I1013 13:25:29.069589 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsc96" 
event={"ID":"46f55e80-87b4-4286-afa0-ab9c7143b02f","Type":"ContainerDied","Data":"8a82a9e40c540c4652fc02f27eeaca8b45824717be1359cb164d63d458e5e12f"} Oct 13 13:25:29 crc kubenswrapper[4797]: I1013 13:25:29.115649 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.115630208 podStartE2EDuration="5.115630208s" podCreationTimestamp="2025-10-13 13:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:29.092259305 +0000 UTC m=+1106.625809581" watchObservedRunningTime="2025-10-13 13:25:29.115630208 +0000 UTC m=+1106.649180454" Oct 13 13:25:31 crc kubenswrapper[4797]: I1013 13:25:31.092066 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-778fd9d9d-t868n" event={"ID":"d5e82c55-e59e-4d97-800c-66a4f9555047","Type":"ContainerStarted","Data":"87b4a365c29d87d012645e72faf44be655aa1e8a7cac0dd56068d6bbc703513f"} Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.519457 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tsc96" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.593465 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-fernet-keys\") pod \"46f55e80-87b4-4286-afa0-ab9c7143b02f\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.593539 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-config-data\") pod \"46f55e80-87b4-4286-afa0-ab9c7143b02f\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.593605 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-scripts\") pod \"46f55e80-87b4-4286-afa0-ab9c7143b02f\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.593631 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n8f5\" (UniqueName: \"kubernetes.io/projected/46f55e80-87b4-4286-afa0-ab9c7143b02f-kube-api-access-9n8f5\") pod \"46f55e80-87b4-4286-afa0-ab9c7143b02f\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.593658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-credential-keys\") pod \"46f55e80-87b4-4286-afa0-ab9c7143b02f\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.593710 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-combined-ca-bundle\") pod \"46f55e80-87b4-4286-afa0-ab9c7143b02f\" (UID: \"46f55e80-87b4-4286-afa0-ab9c7143b02f\") " Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.598507 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-scripts" (OuterVolumeSpecName: "scripts") pod "46f55e80-87b4-4286-afa0-ab9c7143b02f" (UID: "46f55e80-87b4-4286-afa0-ab9c7143b02f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.598626 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "46f55e80-87b4-4286-afa0-ab9c7143b02f" (UID: "46f55e80-87b4-4286-afa0-ab9c7143b02f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.599305 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "46f55e80-87b4-4286-afa0-ab9c7143b02f" (UID: "46f55e80-87b4-4286-afa0-ab9c7143b02f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.601927 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f55e80-87b4-4286-afa0-ab9c7143b02f-kube-api-access-9n8f5" (OuterVolumeSpecName: "kube-api-access-9n8f5") pod "46f55e80-87b4-4286-afa0-ab9c7143b02f" (UID: "46f55e80-87b4-4286-afa0-ab9c7143b02f"). InnerVolumeSpecName "kube-api-access-9n8f5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.622842 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46f55e80-87b4-4286-afa0-ab9c7143b02f" (UID: "46f55e80-87b4-4286-afa0-ab9c7143b02f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.623377 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-config-data" (OuterVolumeSpecName: "config-data") pod "46f55e80-87b4-4286-afa0-ab9c7143b02f" (UID: "46f55e80-87b4-4286-afa0-ab9c7143b02f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.696460 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.696491 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n8f5\" (UniqueName: \"kubernetes.io/projected/46f55e80-87b4-4286-afa0-ab9c7143b02f-kube-api-access-9n8f5\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.696505 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.696515 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.696524 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:33 crc kubenswrapper[4797]: I1013 13:25:33.696532 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f55e80-87b4-4286-afa0-ab9c7143b02f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.122316 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsc96" event={"ID":"46f55e80-87b4-4286-afa0-ab9c7143b02f","Type":"ContainerDied","Data":"7a5d63e49a4b4beb1a57151e937ab6386a5f7cf6cce251ff5f834873da3c9c14"} Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.122646 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5d63e49a4b4beb1a57151e937ab6386a5f7cf6cce251ff5f834873da3c9c14" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.122389 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tsc96" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.487840 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.487895 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.518944 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.533186 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.557751 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.557831 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.599573 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.632093 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.672597 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-789bb6874b-qp58p"] Oct 13 13:25:34 crc kubenswrapper[4797]: E1013 13:25:34.673584 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f55e80-87b4-4286-afa0-ab9c7143b02f" containerName="keystone-bootstrap" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.673606 4797 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="46f55e80-87b4-4286-afa0-ab9c7143b02f" containerName="keystone-bootstrap" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.673821 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f55e80-87b4-4286-afa0-ab9c7143b02f" containerName="keystone-bootstrap" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.674526 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.679827 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.680033 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.680103 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.680241 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.680346 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.680065 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4snmq" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.693494 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-789bb6874b-qp58p"] Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.736903 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-config-data\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " 
pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.736967 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-combined-ca-bundle\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.737013 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-internal-tls-certs\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.737068 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-credential-keys\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.737095 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-scripts\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.737131 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmdl\" (UniqueName: \"kubernetes.io/projected/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-kube-api-access-5nmdl\") pod \"keystone-789bb6874b-qp58p\" (UID: 
\"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.737182 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-fernet-keys\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.737224 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838470 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-credential-keys\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838521 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-scripts\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838607 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nmdl\" (UniqueName: \"kubernetes.io/projected/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-kube-api-access-5nmdl\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " 
pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-fernet-keys\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838744 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838791 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-config-data\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838846 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-combined-ca-bundle\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.838899 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-internal-tls-certs\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 
13:25:34.843490 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-credential-keys\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.844509 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-scripts\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.844629 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-combined-ca-bundle\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.845973 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.846455 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-fernet-keys\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.848330 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-config-data\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.848963 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-internal-tls-certs\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.863024 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nmdl\" (UniqueName: \"kubernetes.io/projected/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-kube-api-access-5nmdl\") pod \"keystone-789bb6874b-qp58p\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:34 crc kubenswrapper[4797]: I1013 13:25:34.995190 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:35 crc kubenswrapper[4797]: I1013 13:25:35.130501 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 13:25:35 crc kubenswrapper[4797]: I1013 13:25:35.130533 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:35 crc kubenswrapper[4797]: I1013 13:25:35.130629 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:35 crc kubenswrapper[4797]: I1013 13:25:35.131049 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 13:25:37 crc kubenswrapper[4797]: I1013 13:25:37.057542 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 13:25:37 crc kubenswrapper[4797]: I1013 13:25:37.059311 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 13:25:37 crc kubenswrapper[4797]: I1013 13:25:37.254495 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:37 crc kubenswrapper[4797]: I1013 13:25:37.254576 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 13:25:37 crc kubenswrapper[4797]: I1013 13:25:37.323451 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 13:25:42 crc kubenswrapper[4797]: E1013 13:25:42.235391 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:121a845dacd051814fb4709fc557420363cd923a9cf2b4ed09addd394f83a3f5" Oct 13 13:25:42 crc 
kubenswrapper[4797]: E1013 13:25:42.236391 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:121a845dacd051814fb4709fc557420363cd923a9cf2b4ed09addd394f83a3f5,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbkvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tqpk9_openstack(bb8faf00-58df-4131-af70-117df286f396): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 
13:25:42 crc kubenswrapper[4797]: E1013 13:25:42.237658 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tqpk9" podUID="bb8faf00-58df-4131-af70-117df286f396" Oct 13 13:25:43 crc kubenswrapper[4797]: E1013 13:25:43.216456 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:121a845dacd051814fb4709fc557420363cd923a9cf2b4ed09addd394f83a3f5\\\"\"" pod="openstack/barbican-db-sync-tqpk9" podUID="bb8faf00-58df-4131-af70-117df286f396" Oct 13 13:25:43 crc kubenswrapper[4797]: E1013 13:25:43.340425 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:93b475af15a0d10e95cb17b98927077f05ac24c89472a601d677eb89f82fd429" Oct 13 13:25:43 crc kubenswrapper[4797]: E1013 13:25:43.340569 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:93b475af15a0d10e95cb17b98927077f05ac24c89472a601d677eb89f82fd429,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwgl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xzbbx_openstack(d7978a82-8a2f-4a86-8598-65e7dae25b77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:25:43 crc kubenswrapper[4797]: E1013 13:25:43.341856 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xzbbx" podUID="d7978a82-8a2f-4a86-8598-65e7dae25b77" Oct 13 13:25:43 crc kubenswrapper[4797]: W1013 13:25:43.758116 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf02d7e_6b92_4d2a_838f_20cdd6a7046e.slice/crio-5aa99e1063a1d06b9d809420818758f38869724bffd3398b9dda527ddb797d2e WatchSource:0}: Error finding container 5aa99e1063a1d06b9d809420818758f38869724bffd3398b9dda527ddb797d2e: Status 404 returned error can't find the container with id 5aa99e1063a1d06b9d809420818758f38869724bffd3398b9dda527ddb797d2e Oct 13 13:25:43 crc kubenswrapper[4797]: I1013 13:25:43.760257 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-789bb6874b-qp58p"] Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.231903 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerStarted","Data":"cf8018877e5ab2c9172ef35f55edb3b2e0ac0fa2ff314039b589665b34660caf"} Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.234490 4797 generic.go:334] "Generic (PLEG): container finished" podID="c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" containerID="4251a02173a50fd16205c9b173f202379b93c111710f2865655e603d39f3ed2c" exitCode=0 
Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.234585 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d764t" event={"ID":"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979","Type":"ContainerDied","Data":"4251a02173a50fd16205c9b173f202379b93c111710f2865655e603d39f3ed2c"} Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.237412 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-789bb6874b-qp58p" event={"ID":"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e","Type":"ContainerStarted","Data":"9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155"} Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.239744 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.239770 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-789bb6874b-qp58p" event={"ID":"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e","Type":"ContainerStarted","Data":"5aa99e1063a1d06b9d809420818758f38869724bffd3398b9dda527ddb797d2e"} Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.242603 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-778fd9d9d-t868n" event={"ID":"d5e82c55-e59e-4d97-800c-66a4f9555047","Type":"ContainerStarted","Data":"15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1"} Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.242631 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-778fd9d9d-t868n" event={"ID":"d5e82c55-e59e-4d97-800c-66a4f9555047","Type":"ContainerStarted","Data":"f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026"} Oct 13 13:25:44 crc kubenswrapper[4797]: E1013 13:25:44.244348 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:93b475af15a0d10e95cb17b98927077f05ac24c89472a601d677eb89f82fd429\\\"\"" pod="openstack/cinder-db-sync-xzbbx" podUID="d7978a82-8a2f-4a86-8598-65e7dae25b77" Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.320628 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-789bb6874b-qp58p" podStartSLOduration=10.320610959 podStartE2EDuration="10.320610959s" podCreationTimestamp="2025-10-13 13:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:44.307870027 +0000 UTC m=+1121.841420303" watchObservedRunningTime="2025-10-13 13:25:44.320610959 +0000 UTC m=+1121.854161205" Oct 13 13:25:44 crc kubenswrapper[4797]: I1013 13:25:44.338106 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-778fd9d9d-t868n" podStartSLOduration=17.338080317 podStartE2EDuration="17.338080317s" podCreationTimestamp="2025-10-13 13:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:44.330212544 +0000 UTC m=+1121.863762830" watchObservedRunningTime="2025-10-13 13:25:44.338080317 +0000 UTC m=+1121.871630603" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.256351 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.256909 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.591813 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.754147 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-config\") pod \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.754317 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6bg\" (UniqueName: \"kubernetes.io/projected/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-kube-api-access-pw6bg\") pod \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.754441 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-combined-ca-bundle\") pod \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\" (UID: \"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979\") " Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.761961 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-kube-api-access-pw6bg" (OuterVolumeSpecName: "kube-api-access-pw6bg") pod "c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" (UID: "c5dd5a95-b091-4f4f-8b05-d49d9dc0e979"). InnerVolumeSpecName "kube-api-access-pw6bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.786412 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-config" (OuterVolumeSpecName: "config") pod "c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" (UID: "c5dd5a95-b091-4f4f-8b05-d49d9dc0e979"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.796151 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" (UID: "c5dd5a95-b091-4f4f-8b05-d49d9dc0e979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.856116 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.856159 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6bg\" (UniqueName: \"kubernetes.io/projected/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-kube-api-access-pw6bg\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:45 crc kubenswrapper[4797]: I1013 13:25:45.856176 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.268718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d764t" event={"ID":"c5dd5a95-b091-4f4f-8b05-d49d9dc0e979","Type":"ContainerDied","Data":"1c7c91e3b6f7738fca6a74da8986b26f92fb80fda1e9e8aea0f68f84b9811ef7"} Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.268792 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c7c91e3b6f7738fca6a74da8986b26f92fb80fda1e9e8aea0f68f84b9811ef7" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.268738 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d764t" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.499201 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-649d884857-8lbk5"] Oct 13 13:25:46 crc kubenswrapper[4797]: E1013 13:25:46.499650 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" containerName="neutron-db-sync" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.499668 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" containerName="neutron-db-sync" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.499909 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" containerName="neutron-db-sync" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.501145 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.515477 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649d884857-8lbk5"] Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.634787 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7fd4c97c98-mmgwk"] Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.636784 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.643984 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.644086 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.644584 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wxbwd" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.644777 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.654151 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fd4c97c98-mmgwk"] Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.677361 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-svc\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.677418 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-nb\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.677456 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-config\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: 
\"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.687042 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8z9s\" (UniqueName: \"kubernetes.io/projected/afdc0228-d55d-4e5b-846d-605af1635de7-kube-api-access-z8z9s\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.687325 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-sb\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.687472 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-swift-storage-0\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789688 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-combined-ca-bundle\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789730 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-httpd-config\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789764 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8z9s\" (UniqueName: \"kubernetes.io/projected/afdc0228-d55d-4e5b-846d-605af1635de7-kube-api-access-z8z9s\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789818 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8lfc\" (UniqueName: \"kubernetes.io/projected/bdf73d21-1504-42d0-9521-9d1201947cc9-kube-api-access-g8lfc\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789852 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-sb\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789894 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-swift-storage-0\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789913 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-ovndb-tls-certs\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789939 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-svc\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789964 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-nb\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.789983 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-config\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.790016 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-config\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.791131 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-sb\") pod 
\"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.791707 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-swift-storage-0\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.792248 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-svc\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.795061 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-nb\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.799672 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-config\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.811480 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8z9s\" (UniqueName: \"kubernetes.io/projected/afdc0228-d55d-4e5b-846d-605af1635de7-kube-api-access-z8z9s\") pod \"dnsmasq-dns-649d884857-8lbk5\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " 
pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.831592 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.891674 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-config\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.891732 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-combined-ca-bundle\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.891755 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-httpd-config\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.891793 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8lfc\" (UniqueName: \"kubernetes.io/projected/bdf73d21-1504-42d0-9521-9d1201947cc9-kube-api-access-g8lfc\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.891870 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-ovndb-tls-certs\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.896584 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-config\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.898037 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-combined-ca-bundle\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.902521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-httpd-config\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.909554 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8lfc\" (UniqueName: \"kubernetes.io/projected/bdf73d21-1504-42d0-9521-9d1201947cc9-kube-api-access-g8lfc\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.913925 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-ovndb-tls-certs\") pod \"neutron-7fd4c97c98-mmgwk\" (UID: 
\"bdf73d21-1504-42d0-9521-9d1201947cc9\") " pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:46 crc kubenswrapper[4797]: I1013 13:25:46.992059 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:47 crc kubenswrapper[4797]: I1013 13:25:47.136106 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649d884857-8lbk5"] Oct 13 13:25:47 crc kubenswrapper[4797]: I1013 13:25:47.811636 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fd4c97c98-mmgwk"] Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.120316 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.120662 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.918195 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57cbbb4d89-r9rvd"] Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.920028 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.922198 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.922702 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 13 13:25:48 crc kubenswrapper[4797]: I1013 13:25:48.933836 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57cbbb4d89-r9rvd"] Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.030212 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-public-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.030522 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-ovndb-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.030592 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-combined-ca-bundle\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.030618 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-httpd-config\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.030787 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tglwn\" (UniqueName: \"kubernetes.io/projected/11a6d485-2926-4d07-9b32-e81ab882de4c-kube-api-access-tglwn\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.030864 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-internal-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.031022 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-config\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133146 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tglwn\" (UniqueName: \"kubernetes.io/projected/11a6d485-2926-4d07-9b32-e81ab882de4c-kube-api-access-tglwn\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133207 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-internal-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133239 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-config\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133296 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-ovndb-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133320 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-public-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133380 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-combined-ca-bundle\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.133411 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-httpd-config\") pod \"neutron-57cbbb4d89-r9rvd\" 
(UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.141692 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-public-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.141692 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-httpd-config\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.141692 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-ovndb-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.142260 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-internal-tls-certs\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.142654 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-combined-ca-bundle\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: 
I1013 13:25:49.144497 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-config\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.154395 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tglwn\" (UniqueName: \"kubernetes.io/projected/11a6d485-2926-4d07-9b32-e81ab882de4c-kube-api-access-tglwn\") pod \"neutron-57cbbb4d89-r9rvd\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:49 crc kubenswrapper[4797]: I1013 13:25:49.284212 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:50 crc kubenswrapper[4797]: W1013 13:25:50.221856 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafdc0228_d55d_4e5b_846d_605af1635de7.slice/crio-9e07c721a2bf27c83db9eda2cae582588646b689137cd825761e9c750545ff83 WatchSource:0}: Error finding container 9e07c721a2bf27c83db9eda2cae582588646b689137cd825761e9c750545ff83: Status 404 returned error can't find the container with id 9e07c721a2bf27c83db9eda2cae582588646b689137cd825761e9c750545ff83 Oct 13 13:25:50 crc kubenswrapper[4797]: I1013 13:25:50.331307 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649d884857-8lbk5" event={"ID":"afdc0228-d55d-4e5b-846d-605af1635de7","Type":"ContainerStarted","Data":"9e07c721a2bf27c83db9eda2cae582588646b689137cd825761e9c750545ff83"} Oct 13 13:25:50 crc kubenswrapper[4797]: W1013 13:25:50.772056 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf73d21_1504_42d0_9521_9d1201947cc9.slice/crio-3d7bddb8281e44c336fd313b3e15c3fb83c4fe41e3a22e6ea040304171c185e8 WatchSource:0}: Error finding container 3d7bddb8281e44c336fd313b3e15c3fb83c4fe41e3a22e6ea040304171c185e8: Status 404 returned error can't find the container with id 3d7bddb8281e44c336fd313b3e15c3fb83c4fe41e3a22e6ea040304171c185e8 Oct 13 13:25:51 crc kubenswrapper[4797]: I1013 13:25:51.341295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd4c97c98-mmgwk" event={"ID":"bdf73d21-1504-42d0-9521-9d1201947cc9","Type":"ContainerStarted","Data":"3d7bddb8281e44c336fd313b3e15c3fb83c4fe41e3a22e6ea040304171c185e8"} Oct 13 13:25:51 crc kubenswrapper[4797]: I1013 13:25:51.876244 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57cbbb4d89-r9rvd"] Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.348759 4797 generic.go:334] "Generic (PLEG): container finished" podID="afdc0228-d55d-4e5b-846d-605af1635de7" containerID="1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c" exitCode=0 Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.348838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649d884857-8lbk5" event={"ID":"afdc0228-d55d-4e5b-846d-605af1635de7","Type":"ContainerDied","Data":"1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c"} Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.354784 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerStarted","Data":"70bb806b739f2dfc6ae327fa45119c1172a3fcf822e02e65f44fe2f6f575c7e5"} Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.354969 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" 
containerName="ceilometer-central-agent" containerID="cri-o://24839f95a992d62de78749ed42c00b47bd3b4a6e9c699eb44cdbcb07b4a559cf" gracePeriod=30 Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.355069 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.355115 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="proxy-httpd" containerID="cri-o://70bb806b739f2dfc6ae327fa45119c1172a3fcf822e02e65f44fe2f6f575c7e5" gracePeriod=30 Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.355167 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="sg-core" containerID="cri-o://cf8018877e5ab2c9172ef35f55edb3b2e0ac0fa2ff314039b589665b34660caf" gracePeriod=30 Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.355214 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-notification-agent" containerID="cri-o://a7fbc889f92c77b5d394129b70266a52ff645e52f67dbaaef83a48efffc55a63" gracePeriod=30 Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.357612 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57cbbb4d89-r9rvd" event={"ID":"11a6d485-2926-4d07-9b32-e81ab882de4c","Type":"ContainerStarted","Data":"366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117"} Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.357650 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57cbbb4d89-r9rvd" event={"ID":"11a6d485-2926-4d07-9b32-e81ab882de4c","Type":"ContainerStarted","Data":"765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8"} Oct 13 13:25:52 crc 
kubenswrapper[4797]: I1013 13:25:52.357661 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57cbbb4d89-r9rvd" event={"ID":"11a6d485-2926-4d07-9b32-e81ab882de4c","Type":"ContainerStarted","Data":"330a258a35155936c58f00a41dda095b1ddd6de665b1352dcea80c60c24af94b"} Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.358487 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.360139 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd4c97c98-mmgwk" event={"ID":"bdf73d21-1504-42d0-9521-9d1201947cc9","Type":"ContainerStarted","Data":"8be75a01073153c2b6a4e433be7da517f34e589f7fa0ce0cac175dab6cfec505"} Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.360172 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd4c97c98-mmgwk" event={"ID":"bdf73d21-1504-42d0-9521-9d1201947cc9","Type":"ContainerStarted","Data":"57283ac326d0242aaf47e4ff612cf4c3235d8825402aa5d07b3d9cdfcbc2afda"} Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.360938 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.410273 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.093688633 podStartE2EDuration="39.410249967s" podCreationTimestamp="2025-10-13 13:25:13 +0000 UTC" firstStartedPulling="2025-10-13 13:25:14.044114742 +0000 UTC m=+1091.577664998" lastFinishedPulling="2025-10-13 13:25:51.360676076 +0000 UTC m=+1128.894226332" observedRunningTime="2025-10-13 13:25:52.407299105 +0000 UTC m=+1129.940849381" watchObservedRunningTime="2025-10-13 13:25:52.410249967 +0000 UTC m=+1129.943800233" Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.440464 4797 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-7fd4c97c98-mmgwk" podStartSLOduration=6.440438677 podStartE2EDuration="6.440438677s" podCreationTimestamp="2025-10-13 13:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:52.431527568 +0000 UTC m=+1129.965077854" watchObservedRunningTime="2025-10-13 13:25:52.440438677 +0000 UTC m=+1129.973988943" Oct 13 13:25:52 crc kubenswrapper[4797]: I1013 13:25:52.461825 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57cbbb4d89-r9rvd" podStartSLOduration=4.46177985 podStartE2EDuration="4.46177985s" podCreationTimestamp="2025-10-13 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:52.448827262 +0000 UTC m=+1129.982377558" watchObservedRunningTime="2025-10-13 13:25:52.46177985 +0000 UTC m=+1129.995330106" Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.372101 4797 generic.go:334] "Generic (PLEG): container finished" podID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerID="70bb806b739f2dfc6ae327fa45119c1172a3fcf822e02e65f44fe2f6f575c7e5" exitCode=0 Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.372510 4797 generic.go:334] "Generic (PLEG): container finished" podID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerID="cf8018877e5ab2c9172ef35f55edb3b2e0ac0fa2ff314039b589665b34660caf" exitCode=2 Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.372523 4797 generic.go:334] "Generic (PLEG): container finished" podID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerID="24839f95a992d62de78749ed42c00b47bd3b4a6e9c699eb44cdbcb07b4a559cf" exitCode=0 Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.372204 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerDied","Data":"70bb806b739f2dfc6ae327fa45119c1172a3fcf822e02e65f44fe2f6f575c7e5"} Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.372619 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerDied","Data":"cf8018877e5ab2c9172ef35f55edb3b2e0ac0fa2ff314039b589665b34660caf"} Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.372659 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerDied","Data":"24839f95a992d62de78749ed42c00b47bd3b4a6e9c699eb44cdbcb07b4a559cf"} Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.374566 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649d884857-8lbk5" event={"ID":"afdc0228-d55d-4e5b-846d-605af1635de7","Type":"ContainerStarted","Data":"322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a"} Oct 13 13:25:53 crc kubenswrapper[4797]: I1013 13:25:53.399693 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-649d884857-8lbk5" podStartSLOduration=7.399671965 podStartE2EDuration="7.399671965s" podCreationTimestamp="2025-10-13 13:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:25:53.39215054 +0000 UTC m=+1130.925700806" watchObservedRunningTime="2025-10-13 13:25:53.399671965 +0000 UTC m=+1130.933222241" Oct 13 13:25:54 crc kubenswrapper[4797]: I1013 13:25:54.387066 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:25:56 crc kubenswrapper[4797]: I1013 13:25:56.413184 4797 generic.go:334] "Generic (PLEG): container finished" podID="f52aa623-f23f-4f02-b744-7c5b1e066e50" 
containerID="a7fbc889f92c77b5d394129b70266a52ff645e52f67dbaaef83a48efffc55a63" exitCode=0 Oct 13 13:25:56 crc kubenswrapper[4797]: I1013 13:25:56.413587 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerDied","Data":"a7fbc889f92c77b5d394129b70266a52ff645e52f67dbaaef83a48efffc55a63"} Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.013566 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.194314 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-config-data\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.194663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-scripts\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.195218 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-log-httpd\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.195314 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-sg-core-conf-yaml\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 
13:25:57.195343 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-run-httpd\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.195705 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.195777 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-combined-ca-bundle\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.195789 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.195893 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp9df\" (UniqueName: \"kubernetes.io/projected/f52aa623-f23f-4f02-b744-7c5b1e066e50-kube-api-access-fp9df\") pod \"f52aa623-f23f-4f02-b744-7c5b1e066e50\" (UID: \"f52aa623-f23f-4f02-b744-7c5b1e066e50\") " Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.196384 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.196429 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f52aa623-f23f-4f02-b744-7c5b1e066e50-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.199361 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-scripts" (OuterVolumeSpecName: "scripts") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.201364 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52aa623-f23f-4f02-b744-7c5b1e066e50-kube-api-access-fp9df" (OuterVolumeSpecName: "kube-api-access-fp9df") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "kube-api-access-fp9df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.222023 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.265168 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.279173 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-config-data" (OuterVolumeSpecName: "config-data") pod "f52aa623-f23f-4f02-b744-7c5b1e066e50" (UID: "f52aa623-f23f-4f02-b744-7c5b1e066e50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.298347 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp9df\" (UniqueName: \"kubernetes.io/projected/f52aa623-f23f-4f02-b744-7c5b1e066e50-kube-api-access-fp9df\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.298383 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.298397 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.298431 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.298443 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52aa623-f23f-4f02-b744-7c5b1e066e50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.426582 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f52aa623-f23f-4f02-b744-7c5b1e066e50","Type":"ContainerDied","Data":"5929e5b3693de4a2e6e846dbca7316d5946ffebd561262e8f926584f4c7d1670"} Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.426641 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.426650 4797 scope.go:117] "RemoveContainer" containerID="70bb806b739f2dfc6ae327fa45119c1172a3fcf822e02e65f44fe2f6f575c7e5" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.461036 4797 scope.go:117] "RemoveContainer" containerID="cf8018877e5ab2c9172ef35f55edb3b2e0ac0fa2ff314039b589665b34660caf" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.485060 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.497587 4797 scope.go:117] "RemoveContainer" containerID="a7fbc889f92c77b5d394129b70266a52ff645e52f67dbaaef83a48efffc55a63" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.499932 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.523274 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:57 crc kubenswrapper[4797]: E1013 13:25:57.523720 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="proxy-httpd" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.523739 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="proxy-httpd" Oct 13 13:25:57 crc kubenswrapper[4797]: E1013 13:25:57.523753 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-central-agent" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.523762 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-central-agent" Oct 13 13:25:57 crc kubenswrapper[4797]: E1013 13:25:57.523781 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="sg-core" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.523790 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="sg-core" Oct 13 13:25:57 crc kubenswrapper[4797]: E1013 13:25:57.523838 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-notification-agent" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.523847 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-notification-agent" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.524053 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-notification-agent" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.524077 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="proxy-httpd" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.524098 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="sg-core" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.524116 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" containerName="ceilometer-central-agent" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.526094 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.529419 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.529800 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.537115 4797 scope.go:117] "RemoveContainer" containerID="24839f95a992d62de78749ed42c00b47bd3b4a6e9c699eb44cdbcb07b4a559cf" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.552681 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.708842 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-scripts\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.708902 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-log-httpd\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.708943 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.708977 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-run-httpd\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.708992 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-config-data\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.709038 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.709055 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8mjg\" (UniqueName: \"kubernetes.io/projected/ee338af0-a1a1-4515-973b-b914753e76cf-kube-api-access-d8mjg\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.810959 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8mjg\" (UniqueName: \"kubernetes.io/projected/ee338af0-a1a1-4515-973b-b914753e76cf-kube-api-access-d8mjg\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.811033 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-scripts\") pod \"ceilometer-0\" (UID: 
\"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.811075 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-log-httpd\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.811110 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.811145 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-run-httpd\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.811159 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-config-data\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.811206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.812346 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-log-httpd\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.812465 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-run-httpd\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.816362 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.816675 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-scripts\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.816841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-config-data\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.820456 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.829382 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d8mjg\" (UniqueName: \"kubernetes.io/projected/ee338af0-a1a1-4515-973b-b914753e76cf-kube-api-access-d8mjg\") pod \"ceilometer-0\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " pod="openstack/ceilometer-0" Oct 13 13:25:57 crc kubenswrapper[4797]: I1013 13:25:57.852009 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:25:58 crc kubenswrapper[4797]: I1013 13:25:58.327250 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:25:58 crc kubenswrapper[4797]: I1013 13:25:58.453914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xzbbx" event={"ID":"d7978a82-8a2f-4a86-8598-65e7dae25b77","Type":"ContainerStarted","Data":"12e079d1fea424b348c31cef1525d77ed92d39ac5a4b98cb78fbf186197a402c"} Oct 13 13:25:58 crc kubenswrapper[4797]: I1013 13:25:58.457293 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerStarted","Data":"1b66a64e7dac0a16d85e2c05b5daca93768ea25698d8906017ed789b0de9b098"} Oct 13 13:25:58 crc kubenswrapper[4797]: I1013 13:25:58.474432 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xzbbx" podStartSLOduration=3.103184642 podStartE2EDuration="34.474387349s" podCreationTimestamp="2025-10-13 13:25:24 +0000 UTC" firstStartedPulling="2025-10-13 13:25:25.328027428 +0000 UTC m=+1102.861577684" lastFinishedPulling="2025-10-13 13:25:56.699230135 +0000 UTC m=+1134.232780391" observedRunningTime="2025-10-13 13:25:58.468842723 +0000 UTC m=+1136.002392979" watchObservedRunningTime="2025-10-13 13:25:58.474387349 +0000 UTC m=+1136.007937605" Oct 13 13:25:58 crc kubenswrapper[4797]: I1013 13:25:58.520328 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-778fd9d9d-t868n" Oct 13 
13:25:58 crc kubenswrapper[4797]: I1013 13:25:58.521673 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:25:59 crc kubenswrapper[4797]: I1013 13:25:59.261764 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52aa623-f23f-4f02-b744-7c5b1e066e50" path="/var/lib/kubelet/pods/f52aa623-f23f-4f02-b744-7c5b1e066e50/volumes" Oct 13 13:25:59 crc kubenswrapper[4797]: I1013 13:25:59.467798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerStarted","Data":"aa3f0cfe1c8e9a489f2c9bfe5437f7132bf4938834a5ca666bd15a9bde15355b"} Oct 13 13:25:59 crc kubenswrapper[4797]: I1013 13:25:59.469788 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tqpk9" event={"ID":"bb8faf00-58df-4131-af70-117df286f396","Type":"ContainerStarted","Data":"07625101ccd79368453ce261cd92f3c80074068cd66650ebfdbe4888531d4cd2"} Oct 13 13:25:59 crc kubenswrapper[4797]: I1013 13:25:59.496094 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tqpk9" podStartSLOduration=1.973262669 podStartE2EDuration="35.496071856s" podCreationTimestamp="2025-10-13 13:25:24 +0000 UTC" firstStartedPulling="2025-10-13 13:25:25.102432139 +0000 UTC m=+1102.635982395" lastFinishedPulling="2025-10-13 13:25:58.625241326 +0000 UTC m=+1136.158791582" observedRunningTime="2025-10-13 13:25:59.493553174 +0000 UTC m=+1137.027103430" watchObservedRunningTime="2025-10-13 13:25:59.496071856 +0000 UTC m=+1137.029622112" Oct 13 13:26:00 crc kubenswrapper[4797]: I1013 13:26:00.486071 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerStarted","Data":"d034c56424f578783826d4b8e3e3adf2868eb02a2d2f39809be377fcbd84b10a"} Oct 13 13:26:00 crc kubenswrapper[4797]: I1013 
13:26:00.486480 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerStarted","Data":"c1c0a25f61889177289bbe5d1b1f2d63cfdb3457c8b73ec447a0b59ec80484d3"} Oct 13 13:26:01 crc kubenswrapper[4797]: I1013 13:26:01.833065 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:26:01 crc kubenswrapper[4797]: I1013 13:26:01.909054 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f7885f7f-jk7j6"] Oct 13 13:26:01 crc kubenswrapper[4797]: I1013 13:26:01.909319 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerName="dnsmasq-dns" containerID="cri-o://90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff" gracePeriod=10 Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.427063 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.560819 4797 generic.go:334] "Generic (PLEG): container finished" podID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerID="90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff" exitCode=0 Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.561223 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" event={"ID":"71552bbd-b6bc-43b0-95ba-0c3dc0d93468","Type":"ContainerDied","Data":"90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff"} Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.561259 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" event={"ID":"71552bbd-b6bc-43b0-95ba-0c3dc0d93468","Type":"ContainerDied","Data":"4adfe03f1df3525f379263654c5803c20a021fde3d136a13d3b414f3a9cbab71"} Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.561281 4797 scope.go:117] "RemoveContainer" containerID="90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.561475 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f7885f7f-jk7j6" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.585351 4797 generic.go:334] "Generic (PLEG): container finished" podID="d7978a82-8a2f-4a86-8598-65e7dae25b77" containerID="12e079d1fea424b348c31cef1525d77ed92d39ac5a4b98cb78fbf186197a402c" exitCode=0 Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.585465 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xzbbx" event={"ID":"d7978a82-8a2f-4a86-8598-65e7dae25b77","Type":"ContainerDied","Data":"12e079d1fea424b348c31cef1525d77ed92d39ac5a4b98cb78fbf186197a402c"} Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.594141 4797 scope.go:117] "RemoveContainer" containerID="684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.601890 4797 generic.go:334] "Generic (PLEG): container finished" podID="bb8faf00-58df-4131-af70-117df286f396" containerID="07625101ccd79368453ce261cd92f3c80074068cd66650ebfdbe4888531d4cd2" exitCode=0 Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.601937 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tqpk9" event={"ID":"bb8faf00-58df-4131-af70-117df286f396","Type":"ContainerDied","Data":"07625101ccd79368453ce261cd92f3c80074068cd66650ebfdbe4888531d4cd2"} Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.602544 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptppm\" (UniqueName: \"kubernetes.io/projected/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-kube-api-access-ptppm\") pod \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.602593 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-config\") pod 
\"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.602638 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-svc\") pod \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.602681 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-nb\") pod \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.602712 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-swift-storage-0\") pod \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.602761 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-sb\") pod \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\" (UID: \"71552bbd-b6bc-43b0-95ba-0c3dc0d93468\") " Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.610020 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-kube-api-access-ptppm" (OuterVolumeSpecName: "kube-api-access-ptppm") pod "71552bbd-b6bc-43b0-95ba-0c3dc0d93468" (UID: "71552bbd-b6bc-43b0-95ba-0c3dc0d93468"). InnerVolumeSpecName "kube-api-access-ptppm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.660557 4797 scope.go:117] "RemoveContainer" containerID="90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff" Oct 13 13:26:02 crc kubenswrapper[4797]: E1013 13:26:02.661137 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff\": container with ID starting with 90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff not found: ID does not exist" containerID="90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.661236 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff"} err="failed to get container status \"90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff\": rpc error: code = NotFound desc = could not find container \"90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff\": container with ID starting with 90c85f1f1692cc90d79b1aa8915b2f93f0fbe1cac190e747ffc66dcea36dd3ff not found: ID does not exist" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.661264 4797 scope.go:117] "RemoveContainer" containerID="684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91" Oct 13 13:26:02 crc kubenswrapper[4797]: E1013 13:26:02.663401 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91\": container with ID starting with 684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91 not found: ID does not exist" containerID="684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.663442 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91"} err="failed to get container status \"684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91\": rpc error: code = NotFound desc = could not find container \"684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91\": container with ID starting with 684d518d30db8023e55b9fa1b1df06ff30fb3b5a41be41ec0aaf4e2bdd6c6a91 not found: ID does not exist" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.705661 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptppm\" (UniqueName: \"kubernetes.io/projected/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-kube-api-access-ptppm\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.712179 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71552bbd-b6bc-43b0-95ba-0c3dc0d93468" (UID: "71552bbd-b6bc-43b0-95ba-0c3dc0d93468"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.735309 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-config" (OuterVolumeSpecName: "config") pod "71552bbd-b6bc-43b0-95ba-0c3dc0d93468" (UID: "71552bbd-b6bc-43b0-95ba-0c3dc0d93468"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.735581 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71552bbd-b6bc-43b0-95ba-0c3dc0d93468" (UID: "71552bbd-b6bc-43b0-95ba-0c3dc0d93468"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.752606 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71552bbd-b6bc-43b0-95ba-0c3dc0d93468" (UID: "71552bbd-b6bc-43b0-95ba-0c3dc0d93468"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.754856 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71552bbd-b6bc-43b0-95ba-0c3dc0d93468" (UID: "71552bbd-b6bc-43b0-95ba-0c3dc0d93468"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.806966 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.807003 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.807017 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.807031 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.807046 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71552bbd-b6bc-43b0-95ba-0c3dc0d93468-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.898900 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f7885f7f-jk7j6"] Oct 13 13:26:02 crc kubenswrapper[4797]: I1013 13:26:02.911743 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f7885f7f-jk7j6"] Oct 13 13:26:03 crc kubenswrapper[4797]: I1013 13:26:03.263181 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" path="/var/lib/kubelet/pods/71552bbd-b6bc-43b0-95ba-0c3dc0d93468/volumes" Oct 13 13:26:03 crc kubenswrapper[4797]: 
I1013 13:26:03.613910 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerStarted","Data":"2c1ef579b9023feaff895b2c21122b7c802d2b9e0b2a3b211f8c95a0dcaab98e"} Oct 13 13:26:03 crc kubenswrapper[4797]: I1013 13:26:03.664800 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.662749161 podStartE2EDuration="6.664780957s" podCreationTimestamp="2025-10-13 13:25:57 +0000 UTC" firstStartedPulling="2025-10-13 13:25:58.348032472 +0000 UTC m=+1135.881582728" lastFinishedPulling="2025-10-13 13:26:02.350064268 +0000 UTC m=+1139.883614524" observedRunningTime="2025-10-13 13:26:03.649681316 +0000 UTC m=+1141.183231593" watchObservedRunningTime="2025-10-13 13:26:03.664780957 +0000 UTC m=+1141.198331223" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.100194 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.118952 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tqpk9" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.259672 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-config-data\") pod \"d7978a82-8a2f-4a86-8598-65e7dae25b77\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.259789 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-combined-ca-bundle\") pod \"d7978a82-8a2f-4a86-8598-65e7dae25b77\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.259864 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-combined-ca-bundle\") pod \"bb8faf00-58df-4131-af70-117df286f396\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.259908 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7978a82-8a2f-4a86-8598-65e7dae25b77-etc-machine-id\") pod \"d7978a82-8a2f-4a86-8598-65e7dae25b77\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.260098 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-scripts\") pod \"d7978a82-8a2f-4a86-8598-65e7dae25b77\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.260635 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d7978a82-8a2f-4a86-8598-65e7dae25b77-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d7978a82-8a2f-4a86-8598-65e7dae25b77" (UID: "d7978a82-8a2f-4a86-8598-65e7dae25b77"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.261431 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-db-sync-config-data\") pod \"d7978a82-8a2f-4a86-8598-65e7dae25b77\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.261777 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwgl8\" (UniqueName: \"kubernetes.io/projected/d7978a82-8a2f-4a86-8598-65e7dae25b77-kube-api-access-cwgl8\") pod \"d7978a82-8a2f-4a86-8598-65e7dae25b77\" (UID: \"d7978a82-8a2f-4a86-8598-65e7dae25b77\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.261916 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-db-sync-config-data\") pod \"bb8faf00-58df-4131-af70-117df286f396\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.262040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbkvk\" (UniqueName: \"kubernetes.io/projected/bb8faf00-58df-4131-af70-117df286f396-kube-api-access-mbkvk\") pod \"bb8faf00-58df-4131-af70-117df286f396\" (UID: \"bb8faf00-58df-4131-af70-117df286f396\") " Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.262971 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7978a82-8a2f-4a86-8598-65e7dae25b77-etc-machine-id\") on 
node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.266855 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-scripts" (OuterVolumeSpecName: "scripts") pod "d7978a82-8a2f-4a86-8598-65e7dae25b77" (UID: "d7978a82-8a2f-4a86-8598-65e7dae25b77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.267259 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8faf00-58df-4131-af70-117df286f396-kube-api-access-mbkvk" (OuterVolumeSpecName: "kube-api-access-mbkvk") pod "bb8faf00-58df-4131-af70-117df286f396" (UID: "bb8faf00-58df-4131-af70-117df286f396"). InnerVolumeSpecName "kube-api-access-mbkvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.268527 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7978a82-8a2f-4a86-8598-65e7dae25b77-kube-api-access-cwgl8" (OuterVolumeSpecName: "kube-api-access-cwgl8") pod "d7978a82-8a2f-4a86-8598-65e7dae25b77" (UID: "d7978a82-8a2f-4a86-8598-65e7dae25b77"). InnerVolumeSpecName "kube-api-access-cwgl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.269466 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d7978a82-8a2f-4a86-8598-65e7dae25b77" (UID: "d7978a82-8a2f-4a86-8598-65e7dae25b77"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.287981 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bb8faf00-58df-4131-af70-117df286f396" (UID: "bb8faf00-58df-4131-af70-117df286f396"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.317216 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7978a82-8a2f-4a86-8598-65e7dae25b77" (UID: "d7978a82-8a2f-4a86-8598-65e7dae25b77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.325474 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8faf00-58df-4131-af70-117df286f396" (UID: "bb8faf00-58df-4131-af70-117df286f396"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.336245 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-config-data" (OuterVolumeSpecName: "config-data") pod "d7978a82-8a2f-4a86-8598-65e7dae25b77" (UID: "d7978a82-8a2f-4a86-8598-65e7dae25b77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366452 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366518 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366549 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366572 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366595 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7978a82-8a2f-4a86-8598-65e7dae25b77-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366617 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwgl8\" (UniqueName: \"kubernetes.io/projected/d7978a82-8a2f-4a86-8598-65e7dae25b77-kube-api-access-cwgl8\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.366641 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8faf00-58df-4131-af70-117df286f396-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 
13:26:04.366663 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbkvk\" (UniqueName: \"kubernetes.io/projected/bb8faf00-58df-4131-af70-117df286f396-kube-api-access-mbkvk\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.625396 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xzbbx" event={"ID":"d7978a82-8a2f-4a86-8598-65e7dae25b77","Type":"ContainerDied","Data":"08ff721a340ac5c90445a430f3d31c4908633e240ef5e5abcd65184839f6125b"} Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.625413 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xzbbx" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.625435 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ff721a340ac5c90445a430f3d31c4908633e240ef5e5abcd65184839f6125b" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.640167 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tqpk9" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.640708 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tqpk9" event={"ID":"bb8faf00-58df-4131-af70-117df286f396","Type":"ContainerDied","Data":"74b974ec6c8f5d365f5a6d85e0623b152ef8cbaff0ccad93ae97eab1dfcd5498"} Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.640790 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b974ec6c8f5d365f5a6d85e0623b152ef8cbaff0ccad93ae97eab1dfcd5498" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.641001 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.976663 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-cc74bd777-hvb5p"] Oct 13 13:26:04 crc kubenswrapper[4797]: E1013 13:26:04.977730 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8faf00-58df-4131-af70-117df286f396" containerName="barbican-db-sync" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.977752 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8faf00-58df-4131-af70-117df286f396" containerName="barbican-db-sync" Oct 13 13:26:04 crc kubenswrapper[4797]: E1013 13:26:04.977775 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerName="init" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.977782 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerName="init" Oct 13 13:26:04 crc kubenswrapper[4797]: E1013 13:26:04.977794 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerName="dnsmasq-dns" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.977814 4797 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerName="dnsmasq-dns" Oct 13 13:26:04 crc kubenswrapper[4797]: E1013 13:26:04.977885 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7978a82-8a2f-4a86-8598-65e7dae25b77" containerName="cinder-db-sync" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.977894 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7978a82-8a2f-4a86-8598-65e7dae25b77" containerName="cinder-db-sync" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.978080 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="71552bbd-b6bc-43b0-95ba-0c3dc0d93468" containerName="dnsmasq-dns" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.978095 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7978a82-8a2f-4a86-8598-65e7dae25b77" containerName="cinder-db-sync" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.978120 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8faf00-58df-4131-af70-117df286f396" containerName="barbican-db-sync" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.979504 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.982762 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.983405 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cb64g" Oct 13 13:26:04 crc kubenswrapper[4797]: I1013 13:26:04.983473 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.008342 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.010187 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.016383 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z87h5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.016676 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.016838 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.016970 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.037416 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cc74bd777-hvb5p"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.054869 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.074360 4797 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/barbican-keystone-listener-76fd44f586-tp25f"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.076573 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.081300 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.082405 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.082464 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-logs\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.082492 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-combined-ca-bundle\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.082512 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom\") pod 
\"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.082570 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxd8\" (UniqueName: \"kubernetes.io/projected/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-kube-api-access-fqxd8\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.125154 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76fd44f586-tp25f"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.135329 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfb6b659f-stc4k"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.136946 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.153912 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfb6b659f-stc4k"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184384 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-combined-ca-bundle\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184438 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184471 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqxd8\" (UniqueName: \"kubernetes.io/projected/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-kube-api-access-fqxd8\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184507 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184532 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d333dae8-679d-4469-bcbf-0c9ee221b136-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184582 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-combined-ca-bundle\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184643 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6d097e4-da24-434c-9b9f-2e84279240a6-logs\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184672 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184694 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data-custom\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184715 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-scripts\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184740 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184778 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbzsl\" (UniqueName: \"kubernetes.io/projected/b6d097e4-da24-434c-9b9f-2e84279240a6-kube-api-access-kbzsl\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184821 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 
13:26:05.184839 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmsj\" (UniqueName: \"kubernetes.io/projected/d333dae8-679d-4469-bcbf-0c9ee221b136-kube-api-access-clmsj\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.184861 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-logs\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.185368 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-logs\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.196358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-combined-ca-bundle\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.197193 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.207004 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.212482 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqxd8\" (UniqueName: \"kubernetes.io/projected/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-kube-api-access-fqxd8\") pod \"barbican-worker-cc74bd777-hvb5p\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.224923 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-966b5d6fd-tjmcl"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.226588 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.237926 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.267405 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfb6b659f-stc4k"] Oct 13 13:26:05 crc kubenswrapper[4797]: E1013 13:26:05.268153 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-k6t9q ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" podUID="cb33be94-ed5d-453a-959b-5b13997b4f8a" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.269082 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-966b5d6fd-tjmcl"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.282962 4797 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-cb9f44c77-zgtk5"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.284480 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.285916 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbzsl\" (UniqueName: \"kubernetes.io/projected/b6d097e4-da24-434c-9b9f-2e84279240a6-kube-api-access-kbzsl\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.285977 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.285992 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clmsj\" (UniqueName: \"kubernetes.io/projected/d333dae8-679d-4469-bcbf-0c9ee221b136-kube-api-access-clmsj\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286027 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-svc\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286054 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286083 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286104 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d333dae8-679d-4469-bcbf-0c9ee221b136-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286135 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286157 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-config\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286183 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-combined-ca-bundle\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286213 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286232 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6t9q\" (UniqueName: \"kubernetes.io/projected/cb33be94-ed5d-453a-959b-5b13997b4f8a-kube-api-access-k6t9q\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286262 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6d097e4-da24-434c-9b9f-2e84279240a6-logs\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data-custom\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286335 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-scripts\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.286356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.295577 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d333dae8-679d-4469-bcbf-0c9ee221b136-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.296040 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6d097e4-da24-434c-9b9f-2e84279240a6-logs\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.300097 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.307853 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.312449 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-combined-ca-bundle\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.312644 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.317537 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-scripts\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.320084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.327320 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kbzsl\" (UniqueName: \"kubernetes.io/projected/b6d097e4-da24-434c-9b9f-2e84279240a6-kube-api-access-kbzsl\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.338024 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb9f44c77-zgtk5"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.368114 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.374664 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data-custom\") pod \"barbican-keystone-listener-76fd44f586-tp25f\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.374711 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmsj\" (UniqueName: \"kubernetes.io/projected/d333dae8-679d-4469-bcbf-0c9ee221b136-kube-api-access-clmsj\") pod \"cinder-scheduler-0\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387749 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " 
pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-config\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387853 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-svc\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387880 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6zb\" (UniqueName: \"kubernetes.io/projected/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-kube-api-access-bp6zb\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387900 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-swift-storage-0\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387921 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6t9q\" (UniqueName: \"kubernetes.io/projected/cb33be94-ed5d-453a-959b-5b13997b4f8a-kube-api-access-k6t9q\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: 
\"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387968 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-config\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.387997 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data-custom\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388024 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-combined-ca-bundle\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwsd6\" (UniqueName: \"kubernetes.io/projected/c7c3de59-8ab4-450c-97bb-4826fb66db39-kube-api-access-kwsd6\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388075 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388097 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-nb\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388141 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388161 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-svc\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388184 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-sb\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388204 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-logs\") pod 
\"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.388220 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.389138 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.389690 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.390220 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-config\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.390227 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " 
pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.390666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-svc\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.418917 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.420822 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.421647 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.422447 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6t9q\" (UniqueName: \"kubernetes.io/projected/cb33be94-ed5d-453a-959b-5b13997b4f8a-kube-api-access-k6t9q\") pod \"dnsmasq-dns-6cfb6b659f-stc4k\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.423824 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.438876 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494565 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " 
pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-sb\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494667 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-logs\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-svc\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494792 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6zb\" (UniqueName: \"kubernetes.io/projected/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-kube-api-access-bp6zb\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494854 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-swift-storage-0\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc 
kubenswrapper[4797]: I1013 13:26:05.494883 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-config\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494910 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data-custom\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-combined-ca-bundle\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.494969 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwsd6\" (UniqueName: \"kubernetes.io/projected/c7c3de59-8ab4-450c-97bb-4826fb66db39-kube-api-access-kwsd6\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.495008 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-nb\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.495296 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-logs\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.496080 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-config\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.496293 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-nb\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.496374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-sb\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.496793 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-svc\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.501168 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-swift-storage-0\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.504993 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.505111 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-combined-ca-bundle\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.519531 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data-custom\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.532850 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6zb\" (UniqueName: \"kubernetes.io/projected/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-kube-api-access-bp6zb\") pod \"barbican-api-966b5d6fd-tjmcl\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.541932 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwsd6\" (UniqueName: 
\"kubernetes.io/projected/c7c3de59-8ab4-450c-97bb-4826fb66db39-kube-api-access-kwsd6\") pod \"dnsmasq-dns-cb9f44c77-zgtk5\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") " pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.603787 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8fbbe28-3138-4280-a1bc-543138023af0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.604323 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8fbbe28-3138-4280-a1bc-543138023af0-logs\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.604358 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.604507 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w97mr\" (UniqueName: \"kubernetes.io/projected/e8fbbe28-3138-4280-a1bc-543138023af0-kube-api-access-w97mr\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.604551 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-scripts\") pod 
\"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.604636 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.604669 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.652145 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.657949 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.705731 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-scripts\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.705824 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.705842 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.705883 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8fbbe28-3138-4280-a1bc-543138023af0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.705922 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8fbbe28-3138-4280-a1bc-543138023af0-logs\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.705951 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.706023 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w97mr\" (UniqueName: \"kubernetes.io/projected/e8fbbe28-3138-4280-a1bc-543138023af0-kube-api-access-w97mr\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.706379 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8fbbe28-3138-4280-a1bc-543138023af0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.706698 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8fbbe28-3138-4280-a1bc-543138023af0-logs\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.713761 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.721999 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.722783 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-scripts\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.728398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.735045 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.750313 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.751636 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w97mr\" (UniqueName: \"kubernetes.io/projected/e8fbbe28-3138-4280-a1bc-543138023af0-kube-api-access-w97mr\") pod \"cinder-api-0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.766248 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.769831 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.806693 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6t9q\" (UniqueName: \"kubernetes.io/projected/cb33be94-ed5d-453a-959b-5b13997b4f8a-kube-api-access-k6t9q\") pod \"cb33be94-ed5d-453a-959b-5b13997b4f8a\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.806751 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-sb\") pod \"cb33be94-ed5d-453a-959b-5b13997b4f8a\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.806770 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-config\") pod \"cb33be94-ed5d-453a-959b-5b13997b4f8a\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.806835 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-svc\") pod \"cb33be94-ed5d-453a-959b-5b13997b4f8a\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.806977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-swift-storage-0\") pod \"cb33be94-ed5d-453a-959b-5b13997b4f8a\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.807040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-nb\") pod \"cb33be94-ed5d-453a-959b-5b13997b4f8a\" (UID: \"cb33be94-ed5d-453a-959b-5b13997b4f8a\") " Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.808523 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cb33be94-ed5d-453a-959b-5b13997b4f8a" (UID: "cb33be94-ed5d-453a-959b-5b13997b4f8a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.810146 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb33be94-ed5d-453a-959b-5b13997b4f8a" (UID: "cb33be94-ed5d-453a-959b-5b13997b4f8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.810481 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-config" (OuterVolumeSpecName: "config") pod "cb33be94-ed5d-453a-959b-5b13997b4f8a" (UID: "cb33be94-ed5d-453a-959b-5b13997b4f8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.813218 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb33be94-ed5d-453a-959b-5b13997b4f8a-kube-api-access-k6t9q" (OuterVolumeSpecName: "kube-api-access-k6t9q") pod "cb33be94-ed5d-453a-959b-5b13997b4f8a" (UID: "cb33be94-ed5d-453a-959b-5b13997b4f8a"). InnerVolumeSpecName "kube-api-access-k6t9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.813556 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb33be94-ed5d-453a-959b-5b13997b4f8a" (UID: "cb33be94-ed5d-453a-959b-5b13997b4f8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.816369 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb33be94-ed5d-453a-959b-5b13997b4f8a" (UID: "cb33be94-ed5d-453a-959b-5b13997b4f8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.910080 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.910129 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.910143 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6t9q\" (UniqueName: \"kubernetes.io/projected/cb33be94-ed5d-453a-959b-5b13997b4f8a-kube-api-access-k6t9q\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.910156 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.910168 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:05 crc kubenswrapper[4797]: I1013 13:26:05.910180 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb33be94-ed5d-453a-959b-5b13997b4f8a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.029515 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cc74bd777-hvb5p"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.154950 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76fd44f586-tp25f"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.352920 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:06 crc kubenswrapper[4797]: W1013 13:26:06.363107 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd333dae8_679d_4469_bcbf_0c9ee221b136.slice/crio-7ec6324b7ab097e36ec43ff9b0e2d131c1273909807bdeb438d5a17104e6a043 WatchSource:0}: Error finding container 7ec6324b7ab097e36ec43ff9b0e2d131c1273909807bdeb438d5a17104e6a043: Status 404 returned error can't find the container with id 7ec6324b7ab097e36ec43ff9b0e2d131c1273909807bdeb438d5a17104e6a043 Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.414918 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.424084 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-966b5d6fd-tjmcl"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.437374 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-cb9f44c77-zgtk5"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.678512 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d333dae8-679d-4469-bcbf-0c9ee221b136","Type":"ContainerStarted","Data":"7ec6324b7ab097e36ec43ff9b0e2d131c1273909807bdeb438d5a17104e6a043"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.682538 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc74bd777-hvb5p" event={"ID":"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0","Type":"ContainerStarted","Data":"128213963d53bab847cdbca6cd310c1dd55ec97434c0b7ad2a2cce6bcb78a94b"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.699113 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8fbbe28-3138-4280-a1bc-543138023af0","Type":"ContainerStarted","Data":"09dbbd758e35913745a3ce0a003230fe37047b2dd5edae5581643817a6328d4d"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.703092 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" event={"ID":"c7c3de59-8ab4-450c-97bb-4826fb66db39","Type":"ContainerStarted","Data":"e9eabd68ecee04115a390bdfaef77ce3985625110380b3d2ca3ef83d5a738b5b"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.705540 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-966b5d6fd-tjmcl" event={"ID":"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3","Type":"ContainerStarted","Data":"c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.705596 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-966b5d6fd-tjmcl" event={"ID":"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3","Type":"ContainerStarted","Data":"a18f5c124fd9bc0689e2bd5852f56e0f4132553028de3ee5d0cc04a5c48490c0"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.707222 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfb6b659f-stc4k" Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.707214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" event={"ID":"b6d097e4-da24-434c-9b9f-2e84279240a6","Type":"ContainerStarted","Data":"bdc3d0adc4f7f76c25a6ad3da7a1f05cdc4c7f810641550ea31eed320d4f2224"} Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.761844 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfb6b659f-stc4k"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.768701 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfb6b659f-stc4k"] Oct 13 13:26:06 crc kubenswrapper[4797]: I1013 13:26:06.980540 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.262015 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb33be94-ed5d-453a-959b-5b13997b4f8a" path="/var/lib/kubelet/pods/cb33be94-ed5d-453a-959b-5b13997b4f8a/volumes" Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.717344 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.718048 4797 generic.go:334] "Generic (PLEG): container finished" podID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerID="0b1d339d1c89a1dbbb64993f7f83a493c2b87570452b25455894f2ba368b079c" exitCode=0 Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.718113 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" event={"ID":"c7c3de59-8ab4-450c-97bb-4826fb66db39","Type":"ContainerDied","Data":"0b1d339d1c89a1dbbb64993f7f83a493c2b87570452b25455894f2ba368b079c"} Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.721116 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-966b5d6fd-tjmcl" event={"ID":"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3","Type":"ContainerStarted","Data":"ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf"} Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.721168 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.721262 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.723307 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8fbbe28-3138-4280-a1bc-543138023af0","Type":"ContainerStarted","Data":"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4"} Oct 13 13:26:07 crc kubenswrapper[4797]: I1013 13:26:07.778442 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-966b5d6fd-tjmcl" podStartSLOduration=2.778425378 podStartE2EDuration="2.778425378s" podCreationTimestamp="2025-10-13 13:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:07.777211738 +0000 UTC m=+1145.310761994" watchObservedRunningTime="2025-10-13 13:26:07.778425378 +0000 UTC m=+1145.311975634" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.033872 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.035581 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.038305 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.038528 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.038716 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bzbvf" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.041461 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.169115 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.169237 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.169297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.169334 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwl7t\" (UniqueName: \"kubernetes.io/projected/f8688abd-e654-404b-924c-9e4cf255f4e8-kube-api-access-lwl7t\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.270732 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.270848 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwl7t\" (UniqueName: \"kubernetes.io/projected/f8688abd-e654-404b-924c-9e4cf255f4e8-kube-api-access-lwl7t\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.270928 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.270989 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.272271 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.282521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.283128 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.291721 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwl7t\" (UniqueName: \"kubernetes.io/projected/f8688abd-e654-404b-924c-9e4cf255f4e8-kube-api-access-lwl7t\") pod \"openstackclient\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.364490 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.758942 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" event={"ID":"c7c3de59-8ab4-450c-97bb-4826fb66db39","Type":"ContainerStarted","Data":"64bc7a351dba6fd2ea568ac0358f7489d2418e46d95c7d997ddaf2c98466fa14"} Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.759378 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.796682 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" podStartSLOduration=3.796666242 podStartE2EDuration="3.796666242s" podCreationTimestamp="2025-10-13 13:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:08.784618436 +0000 UTC m=+1146.318168712" watchObservedRunningTime="2025-10-13 13:26:08.796666242 +0000 UTC m=+1146.330216498" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.800838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" event={"ID":"b6d097e4-da24-434c-9b9f-2e84279240a6","Type":"ContainerStarted","Data":"af88e3a88bd4391cf42637ef4fd96c25933e8e2d449d7aa7ee46670261ef0157"} Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.800879 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" event={"ID":"b6d097e4-da24-434c-9b9f-2e84279240a6","Type":"ContainerStarted","Data":"91313e817dc0492cf7a60f479444a1c27f27b873cabe704400743e07a7510a7d"} Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.988318 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" 
podStartSLOduration=3.306127764 podStartE2EDuration="4.988301008s" podCreationTimestamp="2025-10-13 13:26:04 +0000 UTC" firstStartedPulling="2025-10-13 13:26:06.157227298 +0000 UTC m=+1143.690777554" lastFinishedPulling="2025-10-13 13:26:07.839400542 +0000 UTC m=+1145.372950798" observedRunningTime="2025-10-13 13:26:08.856536409 +0000 UTC m=+1146.390086665" watchObservedRunningTime="2025-10-13 13:26:08.988301008 +0000 UTC m=+1146.521851264" Oct 13 13:26:08 crc kubenswrapper[4797]: I1013 13:26:08.989657 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 13:26:09 crc kubenswrapper[4797]: W1013 13:26:09.178508 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8688abd_e654_404b_924c_9e4cf255f4e8.slice/crio-7e35110e1acc78b1ed7f0949b5c879d8906f8c488eb4346fbbeb6517d0be0bdd WatchSource:0}: Error finding container 7e35110e1acc78b1ed7f0949b5c879d8906f8c488eb4346fbbeb6517d0be0bdd: Status 404 returned error can't find the container with id 7e35110e1acc78b1ed7f0949b5c879d8906f8c488eb4346fbbeb6517d0be0bdd Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.832949 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d333dae8-679d-4469-bcbf-0c9ee221b136","Type":"ContainerStarted","Data":"9ca29efbdc2539e87f73a20111b58a34b8cb37cf884c812f37b79e1055717b72"} Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.835217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc74bd777-hvb5p" event={"ID":"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0","Type":"ContainerStarted","Data":"7f52052a3590935705a3b41e4aa7b4198b91ac9582dd55b94814a428e54fb81b"} Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.835275 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc74bd777-hvb5p" 
event={"ID":"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0","Type":"ContainerStarted","Data":"b77c094eeee1c437962694391b190f9591ed0b8e1210663b157e4b3d4c2a0207"} Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.837518 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8fbbe28-3138-4280-a1bc-543138023af0","Type":"ContainerStarted","Data":"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e"} Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.837634 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api-log" containerID="cri-o://ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4" gracePeriod=30 Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.837681 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.837718 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api" containerID="cri-o://fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e" gracePeriod=30 Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.840843 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8688abd-e654-404b-924c-9e4cf255f4e8","Type":"ContainerStarted","Data":"7e35110e1acc78b1ed7f0949b5c879d8906f8c488eb4346fbbeb6517d0be0bdd"} Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.854208 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-cc74bd777-hvb5p" podStartSLOduration=2.622800208 podStartE2EDuration="5.854187528s" podCreationTimestamp="2025-10-13 13:26:04 +0000 UTC" firstStartedPulling="2025-10-13 13:26:06.058995271 +0000 UTC m=+1143.592545517" 
lastFinishedPulling="2025-10-13 13:26:09.290382581 +0000 UTC m=+1146.823932837" observedRunningTime="2025-10-13 13:26:09.851120953 +0000 UTC m=+1147.384671209" watchObservedRunningTime="2025-10-13 13:26:09.854187528 +0000 UTC m=+1147.387737784" Oct 13 13:26:09 crc kubenswrapper[4797]: I1013 13:26:09.886030 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.8860078179999995 podStartE2EDuration="4.886007818s" podCreationTimestamp="2025-10-13 13:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:09.869793581 +0000 UTC m=+1147.403343857" watchObservedRunningTime="2025-10-13 13:26:09.886007818 +0000 UTC m=+1147.419558074" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.592054 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.738857 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data-custom\") pod \"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.738917 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data\") pod \"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.739002 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w97mr\" (UniqueName: \"kubernetes.io/projected/e8fbbe28-3138-4280-a1bc-543138023af0-kube-api-access-w97mr\") pod 
\"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.739081 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-scripts\") pod \"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.739108 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8fbbe28-3138-4280-a1bc-543138023af0-logs\") pod \"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.739186 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8fbbe28-3138-4280-a1bc-543138023af0-etc-machine-id\") pod \"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.739220 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-combined-ca-bundle\") pod \"e8fbbe28-3138-4280-a1bc-543138023af0\" (UID: \"e8fbbe28-3138-4280-a1bc-543138023af0\") " Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.740274 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8fbbe28-3138-4280-a1bc-543138023af0-logs" (OuterVolumeSpecName: "logs") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.740361 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fbbe28-3138-4280-a1bc-543138023af0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.746234 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fbbe28-3138-4280-a1bc-543138023af0-kube-api-access-w97mr" (OuterVolumeSpecName: "kube-api-access-w97mr") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "kube-api-access-w97mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.746382 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-scripts" (OuterVolumeSpecName: "scripts") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.746669 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.784989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.812315 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data" (OuterVolumeSpecName: "config-data") pod "e8fbbe28-3138-4280-a1bc-543138023af0" (UID: "e8fbbe28-3138-4280-a1bc-543138023af0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842231 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842272 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842289 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w97mr\" (UniqueName: \"kubernetes.io/projected/e8fbbe28-3138-4280-a1bc-543138023af0-kube-api-access-w97mr\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842302 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 
13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842313 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8fbbe28-3138-4280-a1bc-543138023af0-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842321 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8fbbe28-3138-4280-a1bc-543138023af0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.842333 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fbbe28-3138-4280-a1bc-543138023af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.857012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d333dae8-679d-4469-bcbf-0c9ee221b136","Type":"ContainerStarted","Data":"f29b8911fc63bc6933a9f53dc18e60ef397bda730d601a5ef719e2d0ce2050d8"} Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.861498 4797 generic.go:334] "Generic (PLEG): container finished" podID="e8fbbe28-3138-4280-a1bc-543138023af0" containerID="fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e" exitCode=0 Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.861537 4797 generic.go:334] "Generic (PLEG): container finished" podID="e8fbbe28-3138-4280-a1bc-543138023af0" containerID="ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4" exitCode=143 Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.862924 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.863141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8fbbe28-3138-4280-a1bc-543138023af0","Type":"ContainerDied","Data":"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e"} Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.863197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8fbbe28-3138-4280-a1bc-543138023af0","Type":"ContainerDied","Data":"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4"} Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.863216 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8fbbe28-3138-4280-a1bc-543138023af0","Type":"ContainerDied","Data":"09dbbd758e35913745a3ce0a003230fe37047b2dd5edae5581643817a6328d4d"} Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.863233 4797 scope.go:117] "RemoveContainer" containerID="fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.887293 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.012021796 podStartE2EDuration="6.887274825s" podCreationTimestamp="2025-10-13 13:26:04 +0000 UTC" firstStartedPulling="2025-10-13 13:26:06.36829442 +0000 UTC m=+1143.901844676" lastFinishedPulling="2025-10-13 13:26:07.243547449 +0000 UTC m=+1144.777097705" observedRunningTime="2025-10-13 13:26:10.886042564 +0000 UTC m=+1148.419592830" watchObservedRunningTime="2025-10-13 13:26:10.887274825 +0000 UTC m=+1148.420825081" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.910104 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.922861 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-api-0"] Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.963353 4797 scope.go:117] "RemoveContainer" containerID="ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.979711 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:10 crc kubenswrapper[4797]: E1013 13:26:10.980104 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.980120 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api" Oct 13 13:26:10 crc kubenswrapper[4797]: E1013 13:26:10.980150 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api-log" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.980157 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api-log" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.980333 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.980347 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" containerName="cinder-api-log" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.981313 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.983702 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.983757 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.984014 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 13 13:26:10 crc kubenswrapper[4797]: I1013 13:26:10.989480 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.012944 4797 scope.go:117] "RemoveContainer" containerID="fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e" Oct 13 13:26:11 crc kubenswrapper[4797]: E1013 13:26:11.013989 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e\": container with ID starting with fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e not found: ID does not exist" containerID="fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.014096 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e"} err="failed to get container status \"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e\": rpc error: code = NotFound desc = could not find container \"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e\": container with ID starting with fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e not found: ID does not exist" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 
13:26:11.014176 4797 scope.go:117] "RemoveContainer" containerID="ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4" Oct 13 13:26:11 crc kubenswrapper[4797]: E1013 13:26:11.015299 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4\": container with ID starting with ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4 not found: ID does not exist" containerID="ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.015385 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4"} err="failed to get container status \"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4\": rpc error: code = NotFound desc = could not find container \"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4\": container with ID starting with ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4 not found: ID does not exist" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.015453 4797 scope.go:117] "RemoveContainer" containerID="fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.015754 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e"} err="failed to get container status \"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e\": rpc error: code = NotFound desc = could not find container \"fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e\": container with ID starting with fe4ab14db452937ff907512470f5c5b052cc3bda722e5d37de183d908cfb800e not found: ID does not exist" Oct 13 13:26:11 crc 
kubenswrapper[4797]: I1013 13:26:11.015874 4797 scope.go:117] "RemoveContainer" containerID="ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.016305 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4"} err="failed to get container status \"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4\": rpc error: code = NotFound desc = could not find container \"ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4\": container with ID starting with ea9b6c488a046a9e122a73b3301f9988fa128e615627df7adc9ebe366f4922c4 not found: ID does not exist" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047472 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047534 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-logs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047685 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9w9g\" (UniqueName: \"kubernetes.io/projected/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-kube-api-access-p9w9g\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047738 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-scripts\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047785 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047824 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.047976 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.048007 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.048037 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150061 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150127 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150170 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-logs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9w9g\" (UniqueName: \"kubernetes.io/projected/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-kube-api-access-p9w9g\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150222 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-scripts\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 
13:26:11.150247 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150263 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150328 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150348 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.150529 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.151026 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-logs\") pod \"cinder-api-0\" (UID: 
\"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.155423 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.155423 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.155999 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.159314 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-scripts\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.167533 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.171716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.175263 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9w9g\" (UniqueName: \"kubernetes.io/projected/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-kube-api-access-p9w9g\") pod \"cinder-api-0\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.248723 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fbbe28-3138-4280-a1bc-543138023af0" path="/var/lib/kubelet/pods/e8fbbe28-3138-4280-a1bc-543138023af0/volumes" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.315410 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:26:11 crc kubenswrapper[4797]: I1013 13:26:11.894573 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:26:11 crc kubenswrapper[4797]: W1013 13:26:11.913093 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b2f17a4_493b_4b76_9dea_ef70ed8e1525.slice/crio-a53f1c7a7c1e6e96ec178bbdc83a3b7e28f3c89077a62580d8233e38b871188d WatchSource:0}: Error finding container a53f1c7a7c1e6e96ec178bbdc83a3b7e28f3c89077a62580d8233e38b871188d: Status 404 returned error can't find the container with id a53f1c7a7c1e6e96ec178bbdc83a3b7e28f3c89077a62580d8233e38b871188d Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.027492 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-575995d4c4-dskht"] Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.029267 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.037332 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.037772 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.042726 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-575995d4c4-dskht"] Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.177682 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data-custom\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.178111 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.178144 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-logs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.178223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-combined-ca-bundle\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.178284 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-public-tls-certs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.178630 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-internal-tls-certs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.178714 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9zt\" (UniqueName: \"kubernetes.io/projected/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-kube-api-access-hr9zt\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.192463 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c8957dfc-xhpz5"] Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.193954 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.196765 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.198168 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.202086 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.227822 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c8957dfc-xhpz5"] Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-combined-ca-bundle\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280138 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9zt\" (UniqueName: \"kubernetes.io/projected/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-kube-api-access-hr9zt\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280159 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data-custom\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 
13:26:12.280213 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-run-httpd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280262 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-log-httpd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280327 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-public-tls-certs\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280388 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-internal-tls-certs\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280416 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 
13:26:12.280442 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-logs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280473 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-combined-ca-bundle\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280514 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-public-tls-certs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.280543 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-etc-swift\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.281386 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-logs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.281511 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-config-data\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.281614 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94xd\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-kube-api-access-c94xd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.281757 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-internal-tls-certs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.286225 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-combined-ca-bundle\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.288260 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-internal-tls-certs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.289046 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.293387 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data-custom\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.295041 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-public-tls-certs\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.301462 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9zt\" (UniqueName: \"kubernetes.io/projected/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-kube-api-access-hr9zt\") pod \"barbican-api-575995d4c4-dskht\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.348197 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.383864 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-internal-tls-certs\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.383947 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-etc-swift\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.383980 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-config-data\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.384026 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94xd\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-kube-api-access-c94xd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.384086 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-combined-ca-bundle\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " 
pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.384111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-run-httpd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.384140 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-log-httpd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.384171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-public-tls-certs\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.385065 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-run-httpd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.385195 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-log-httpd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.389672 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-etc-swift\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.392394 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-public-tls-certs\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.393447 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-config-data\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.396563 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-combined-ca-bundle\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.401386 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-internal-tls-certs\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.418145 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94xd\" (UniqueName: 
\"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-kube-api-access-c94xd\") pod \"swift-proxy-6c8957dfc-xhpz5\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.524459 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.851313 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-575995d4c4-dskht"] Oct 13 13:26:12 crc kubenswrapper[4797]: W1013 13:26:12.869231 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d561c30_1e2f_4a3d_b042_8191c88e4bb6.slice/crio-db29d3b34ea1cf5562679209393def260dfd35fbfea46ae481040c8d0f040700 WatchSource:0}: Error finding container db29d3b34ea1cf5562679209393def260dfd35fbfea46ae481040c8d0f040700: Status 404 returned error can't find the container with id db29d3b34ea1cf5562679209393def260dfd35fbfea46ae481040c8d0f040700 Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.888993 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575995d4c4-dskht" event={"ID":"3d561c30-1e2f-4a3d-b042-8191c88e4bb6","Type":"ContainerStarted","Data":"db29d3b34ea1cf5562679209393def260dfd35fbfea46ae481040c8d0f040700"} Oct 13 13:26:12 crc kubenswrapper[4797]: I1013 13:26:12.890331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b2f17a4-493b-4b76-9dea-ef70ed8e1525","Type":"ContainerStarted","Data":"a53f1c7a7c1e6e96ec178bbdc83a3b7e28f3c89077a62580d8233e38b871188d"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.115565 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.116336 4797 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-central-agent" containerID="cri-o://aa3f0cfe1c8e9a489f2c9bfe5437f7132bf4938834a5ca666bd15a9bde15355b" gracePeriod=30 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.117170 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-notification-agent" containerID="cri-o://c1c0a25f61889177289bbe5d1b1f2d63cfdb3457c8b73ec447a0b59ec80484d3" gracePeriod=30 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.117198 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="sg-core" containerID="cri-o://d034c56424f578783826d4b8e3e3adf2868eb02a2d2f39809be377fcbd84b10a" gracePeriod=30 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.117198 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="proxy-httpd" containerID="cri-o://2c1ef579b9023feaff895b2c21122b7c802d2b9e0b2a3b211f8c95a0dcaab98e" gracePeriod=30 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.135766 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.188030 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c8957dfc-xhpz5"] Oct 13 13:26:13 crc kubenswrapper[4797]: W1013 13:26:13.191187 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fed4821_c587_408b_b6b0_bcc080170628.slice/crio-58bd0faa103d5cf78cea4f0ad31e883565b2b75c51534697f9eb48399ee2b257 WatchSource:0}: Error finding container 58bd0faa103d5cf78cea4f0ad31e883565b2b75c51534697f9eb48399ee2b257: Status 404 returned error can't find the container with id 58bd0faa103d5cf78cea4f0ad31e883565b2b75c51534697f9eb48399ee2b257 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.916134 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575995d4c4-dskht" event={"ID":"3d561c30-1e2f-4a3d-b042-8191c88e4bb6","Type":"ContainerStarted","Data":"658bce01fb068a8991c6ad520dbfd6eedee82a6e9a0f4f191a112fb1f5f569bf"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.916836 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.916945 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575995d4c4-dskht" event={"ID":"3d561c30-1e2f-4a3d-b042-8191c88e4bb6","Type":"ContainerStarted","Data":"7fbdf32f1bb326754cf202e723f8c39032e73df93800b78e5c5cdcd412a37605"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.917036 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.921840 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8957dfc-xhpz5" event={"ID":"6fed4821-c587-408b-b6b0-bcc080170628","Type":"ContainerStarted","Data":"b0e1792b043ca301b8bfdc8d457a0e08d3ef2be1af52069dc3da398ff15a9eb7"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.921902 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8957dfc-xhpz5" 
event={"ID":"6fed4821-c587-408b-b6b0-bcc080170628","Type":"ContainerStarted","Data":"6d39515a47c02aadeb66a5f42bd4f250497bcfc5b08bfe8c08ae7fa7aae9544b"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.921914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8957dfc-xhpz5" event={"ID":"6fed4821-c587-408b-b6b0-bcc080170628","Type":"ContainerStarted","Data":"58bd0faa103d5cf78cea4f0ad31e883565b2b75c51534697f9eb48399ee2b257"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.921948 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.922116 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.924975 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b2f17a4-493b-4b76-9dea-ef70ed8e1525","Type":"ContainerStarted","Data":"bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.925005 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b2f17a4-493b-4b76-9dea-ef70ed8e1525","Type":"ContainerStarted","Data":"b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.925439 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.932342 4797 generic.go:334] "Generic (PLEG): container finished" podID="ee338af0-a1a1-4515-973b-b914753e76cf" containerID="2c1ef579b9023feaff895b2c21122b7c802d2b9e0b2a3b211f8c95a0dcaab98e" exitCode=0 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.932465 4797 generic.go:334] "Generic (PLEG): container finished" podID="ee338af0-a1a1-4515-973b-b914753e76cf" 
containerID="d034c56424f578783826d4b8e3e3adf2868eb02a2d2f39809be377fcbd84b10a" exitCode=2 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.932526 4797 generic.go:334] "Generic (PLEG): container finished" podID="ee338af0-a1a1-4515-973b-b914753e76cf" containerID="aa3f0cfe1c8e9a489f2c9bfe5437f7132bf4938834a5ca666bd15a9bde15355b" exitCode=0 Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.932593 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerDied","Data":"2c1ef579b9023feaff895b2c21122b7c802d2b9e0b2a3b211f8c95a0dcaab98e"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.932656 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerDied","Data":"d034c56424f578783826d4b8e3e3adf2868eb02a2d2f39809be377fcbd84b10a"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.932713 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerDied","Data":"aa3f0cfe1c8e9a489f2c9bfe5437f7132bf4938834a5ca666bd15a9bde15355b"} Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.945490 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-575995d4c4-dskht" podStartSLOduration=1.94547312 podStartE2EDuration="1.94547312s" podCreationTimestamp="2025-10-13 13:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:13.934675545 +0000 UTC m=+1151.468225811" watchObservedRunningTime="2025-10-13 13:26:13.94547312 +0000 UTC m=+1151.479023376" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.968437 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c8957dfc-xhpz5" 
podStartSLOduration=1.968415552 podStartE2EDuration="1.968415552s" podCreationTimestamp="2025-10-13 13:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:13.95487897 +0000 UTC m=+1151.488429246" watchObservedRunningTime="2025-10-13 13:26:13.968415552 +0000 UTC m=+1151.501965808" Oct 13 13:26:13 crc kubenswrapper[4797]: I1013 13:26:13.984540 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9845162370000002 podStartE2EDuration="3.984516237s" podCreationTimestamp="2025-10-13 13:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:13.973909187 +0000 UTC m=+1151.507459453" watchObservedRunningTime="2025-10-13 13:26:13.984516237 +0000 UTC m=+1151.518066503" Oct 13 13:26:15 crc kubenswrapper[4797]: I1013 13:26:15.658870 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 13 13:26:15 crc kubenswrapper[4797]: I1013 13:26:15.768360 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" Oct 13 13:26:15 crc kubenswrapper[4797]: I1013 13:26:15.868683 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649d884857-8lbk5"] Oct 13 13:26:15 crc kubenswrapper[4797]: I1013 13:26:15.869237 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-649d884857-8lbk5" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" containerName="dnsmasq-dns" containerID="cri-o://322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a" gracePeriod=10 Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.227876 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" 
Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.274672 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.601316 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.710457 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8z9s\" (UniqueName: \"kubernetes.io/projected/afdc0228-d55d-4e5b-846d-605af1635de7-kube-api-access-z8z9s\") pod \"afdc0228-d55d-4e5b-846d-605af1635de7\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.710584 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-nb\") pod \"afdc0228-d55d-4e5b-846d-605af1635de7\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.710675 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-svc\") pod \"afdc0228-d55d-4e5b-846d-605af1635de7\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.710693 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-swift-storage-0\") pod \"afdc0228-d55d-4e5b-846d-605af1635de7\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.710747 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-config\") pod \"afdc0228-d55d-4e5b-846d-605af1635de7\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.710776 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-sb\") pod \"afdc0228-d55d-4e5b-846d-605af1635de7\" (UID: \"afdc0228-d55d-4e5b-846d-605af1635de7\") " Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.724414 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afdc0228-d55d-4e5b-846d-605af1635de7-kube-api-access-z8z9s" (OuterVolumeSpecName: "kube-api-access-z8z9s") pod "afdc0228-d55d-4e5b-846d-605af1635de7" (UID: "afdc0228-d55d-4e5b-846d-605af1635de7"). InnerVolumeSpecName "kube-api-access-z8z9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.808146 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-config" (OuterVolumeSpecName: "config") pod "afdc0228-d55d-4e5b-846d-605af1635de7" (UID: "afdc0228-d55d-4e5b-846d-605af1635de7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.808947 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "afdc0228-d55d-4e5b-846d-605af1635de7" (UID: "afdc0228-d55d-4e5b-846d-605af1635de7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.812992 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.813034 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.813047 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8z9s\" (UniqueName: \"kubernetes.io/projected/afdc0228-d55d-4e5b-846d-605af1635de7-kube-api-access-z8z9s\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.823779 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "afdc0228-d55d-4e5b-846d-605af1635de7" (UID: "afdc0228-d55d-4e5b-846d-605af1635de7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.829429 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "afdc0228-d55d-4e5b-846d-605af1635de7" (UID: "afdc0228-d55d-4e5b-846d-605af1635de7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.869026 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "afdc0228-d55d-4e5b-846d-605af1635de7" (UID: "afdc0228-d55d-4e5b-846d-605af1635de7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.915306 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.915338 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.915349 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdc0228-d55d-4e5b-846d-605af1635de7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.993355 4797 generic.go:334] "Generic (PLEG): container finished" podID="afdc0228-d55d-4e5b-846d-605af1635de7" containerID="322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a" exitCode=0 Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.993589 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="cinder-scheduler" containerID="cri-o://9ca29efbdc2539e87f73a20111b58a34b8cb37cf884c812f37b79e1055717b72" gracePeriod=30 Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.993832 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-649d884857-8lbk5" event={"ID":"afdc0228-d55d-4e5b-846d-605af1635de7","Type":"ContainerDied","Data":"322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a"} Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.993921 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649d884857-8lbk5" event={"ID":"afdc0228-d55d-4e5b-846d-605af1635de7","Type":"ContainerDied","Data":"9e07c721a2bf27c83db9eda2cae582588646b689137cd825761e9c750545ff83"} Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.993944 4797 scope.go:117] "RemoveContainer" containerID="322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a" Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.993977 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="probe" containerID="cri-o://f29b8911fc63bc6933a9f53dc18e60ef397bda730d601a5ef719e2d0ce2050d8" gracePeriod=30 Oct 13 13:26:16 crc kubenswrapper[4797]: I1013 13:26:16.994067 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-649d884857-8lbk5" Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.013340 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.074554 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649d884857-8lbk5"] Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.080256 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-649d884857-8lbk5"] Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.253839 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" path="/var/lib/kubelet/pods/afdc0228-d55d-4e5b-846d-605af1635de7/volumes" Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.726097 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.726910 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-log" containerID="cri-o://b445980dba3c97e4a3a6a603ee26d9e781dbb8bb8304c528c84252ac702779dd" gracePeriod=30 Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.727005 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-httpd" containerID="cri-o://70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc" gracePeriod=30 Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.771737 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:17 crc kubenswrapper[4797]: I1013 13:26:17.958737 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.017643 4797 generic.go:334] "Generic (PLEG): container finished" podID="ee338af0-a1a1-4515-973b-b914753e76cf" containerID="c1c0a25f61889177289bbe5d1b1f2d63cfdb3457c8b73ec447a0b59ec80484d3" exitCode=0 Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.017696 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerDied","Data":"c1c0a25f61889177289bbe5d1b1f2d63cfdb3457c8b73ec447a0b59ec80484d3"} Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.020305 4797 generic.go:334] "Generic (PLEG): container finished" podID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerID="f29b8911fc63bc6933a9f53dc18e60ef397bda730d601a5ef719e2d0ce2050d8" exitCode=0 Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.020357 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d333dae8-679d-4469-bcbf-0c9ee221b136","Type":"ContainerDied","Data":"f29b8911fc63bc6933a9f53dc18e60ef397bda730d601a5ef719e2d0ce2050d8"} Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.033738 4797 generic.go:334] "Generic (PLEG): container finished" podID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerID="b445980dba3c97e4a3a6a603ee26d9e781dbb8bb8304c528c84252ac702779dd" exitCode=143 Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.034564 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6252a65e-85f5-42cf-9fce-3cd585d5e834","Type":"ContainerDied","Data":"b445980dba3c97e4a3a6a603ee26d9e781dbb8bb8304c528c84252ac702779dd"} Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.120339 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.120411 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.120463 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.121318 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5941c177a81de15babd5a721122470aaaaccd1b9033980aac5b1cd72b64076c"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:26:18 crc kubenswrapper[4797]: I1013 13:26:18.121383 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://d5941c177a81de15babd5a721122470aaaaccd1b9033980aac5b1cd72b64076c" gracePeriod=600 Oct 13 13:26:19 crc kubenswrapper[4797]: I1013 13:26:19.054697 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="d5941c177a81de15babd5a721122470aaaaccd1b9033980aac5b1cd72b64076c" exitCode=0 Oct 13 13:26:19 crc kubenswrapper[4797]: I1013 13:26:19.054918 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"d5941c177a81de15babd5a721122470aaaaccd1b9033980aac5b1cd72b64076c"} Oct 13 13:26:19 crc kubenswrapper[4797]: I1013 13:26:19.301893 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:26:19 crc kubenswrapper[4797]: I1013 13:26:19.384484 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7fd4c97c98-mmgwk"] Oct 13 13:26:19 crc kubenswrapper[4797]: I1013 13:26:19.384860 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7fd4c97c98-mmgwk" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-api" containerID="cri-o://57283ac326d0242aaf47e4ff612cf4c3235d8825402aa5d07b3d9cdfcbc2afda" gracePeriod=30 Oct 13 13:26:19 crc kubenswrapper[4797]: I1013 13:26:19.385258 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7fd4c97c98-mmgwk" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-httpd" containerID="cri-o://8be75a01073153c2b6a4e433be7da517f34e589f7fa0ce0cac175dab6cfec505" gracePeriod=30 Oct 13 13:26:20 crc kubenswrapper[4797]: I1013 13:26:20.064764 4797 generic.go:334] "Generic (PLEG): container finished" podID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerID="8be75a01073153c2b6a4e433be7da517f34e589f7fa0ce0cac175dab6cfec505" exitCode=0 Oct 13 13:26:20 crc kubenswrapper[4797]: I1013 13:26:20.065135 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd4c97c98-mmgwk" event={"ID":"bdf73d21-1504-42d0-9521-9d1201947cc9","Type":"ContainerDied","Data":"8be75a01073153c2b6a4e433be7da517f34e589f7fa0ce0cac175dab6cfec505"} Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.092102 4797 generic.go:334] "Generic (PLEG): container finished" podID="6252a65e-85f5-42cf-9fce-3cd585d5e834" 
containerID="70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc" exitCode=0 Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.092197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6252a65e-85f5-42cf-9fce-3cd585d5e834","Type":"ContainerDied","Data":"70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc"} Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.094592 4797 generic.go:334] "Generic (PLEG): container finished" podID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerID="9ca29efbdc2539e87f73a20111b58a34b8cb37cf884c812f37b79e1055717b72" exitCode=0 Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.094623 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d333dae8-679d-4469-bcbf-0c9ee221b136","Type":"ContainerDied","Data":"9ca29efbdc2539e87f73a20111b58a34b8cb37cf884c812f37b79e1055717b72"} Oct 13 13:26:21 crc kubenswrapper[4797]: E1013 13:26:21.111735 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6252a65e_85f5_42cf_9fce_3cd585d5e834.slice/crio-conmon-70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6252a65e_85f5_42cf_9fce_3cd585d5e834.slice/crio-70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc.scope\": RecentStats: unable to find data in memory cache]" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.304637 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.305144 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-log" containerID="cri-o://e95744e7a7fb393cf4dc18192de5ca1a6dbdc1518ad298e95bb74bf374852a4b" gracePeriod=30 Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.305234 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-httpd" containerID="cri-o://8a2c55dde5e7cdb88e6195d0cc7cf9c60c2d76d91b4fbb30b8a991240a8dfd5a" gracePeriod=30 Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.678759 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vmbzt"] Oct 13 13:26:21 crc kubenswrapper[4797]: E1013 13:26:21.679151 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" containerName="dnsmasq-dns" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.679168 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" containerName="dnsmasq-dns" Oct 13 13:26:21 crc kubenswrapper[4797]: E1013 13:26:21.679190 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" containerName="init" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.679195 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" containerName="init" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.685088 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="afdc0228-d55d-4e5b-846d-605af1635de7" containerName="dnsmasq-dns" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.685991 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.728299 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8lmx\" (UniqueName: \"kubernetes.io/projected/f6745956-a404-4481-9d3e-3a8056d7aaf2-kube-api-access-m8lmx\") pod \"nova-api-db-create-vmbzt\" (UID: \"f6745956-a404-4481-9d3e-3a8056d7aaf2\") " pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.745401 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vmbzt"] Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.830535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8lmx\" (UniqueName: \"kubernetes.io/projected/f6745956-a404-4481-9d3e-3a8056d7aaf2-kube-api-access-m8lmx\") pod \"nova-api-db-create-vmbzt\" (UID: \"f6745956-a404-4481-9d3e-3a8056d7aaf2\") " pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.850636 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8lmx\" (UniqueName: \"kubernetes.io/projected/f6745956-a404-4481-9d3e-3a8056d7aaf2-kube-api-access-m8lmx\") pod \"nova-api-db-create-vmbzt\" (UID: \"f6745956-a404-4481-9d3e-3a8056d7aaf2\") " pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.896088 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9kldr"] Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.897759 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.904564 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9kldr"] Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.932923 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkkn\" (UniqueName: \"kubernetes.io/projected/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb-kube-api-access-9pkkn\") pod \"nova-cell0-db-create-9kldr\" (UID: \"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb\") " pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.994039 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-66cds"] Oct 13 13:26:21 crc kubenswrapper[4797]: I1013 13:26:21.995110 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.011942 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-66cds"] Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.027352 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.035110 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkkn\" (UniqueName: \"kubernetes.io/projected/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb-kube-api-access-9pkkn\") pod \"nova-cell0-db-create-9kldr\" (UID: \"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb\") " pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.035214 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cdb\" (UniqueName: \"kubernetes.io/projected/3e8201a9-2b2a-46f0-81ea-db939e25d192-kube-api-access-p6cdb\") pod \"nova-cell1-db-create-66cds\" (UID: \"3e8201a9-2b2a-46f0-81ea-db939e25d192\") " pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.057402 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkkn\" (UniqueName: \"kubernetes.io/projected/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb-kube-api-access-9pkkn\") pod \"nova-cell0-db-create-9kldr\" (UID: \"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb\") " pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.111476 4797 generic.go:334] "Generic (PLEG): container finished" podID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerID="e95744e7a7fb393cf4dc18192de5ca1a6dbdc1518ad298e95bb74bf374852a4b" exitCode=143 Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.111631 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99cf4c75-5042-4f58-945f-5461cad0fbcc","Type":"ContainerDied","Data":"e95744e7a7fb393cf4dc18192de5ca1a6dbdc1518ad298e95bb74bf374852a4b"} Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.113666 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerID="57283ac326d0242aaf47e4ff612cf4c3235d8825402aa5d07b3d9cdfcbc2afda" exitCode=0 Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.113695 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd4c97c98-mmgwk" event={"ID":"bdf73d21-1504-42d0-9521-9d1201947cc9","Type":"ContainerDied","Data":"57283ac326d0242aaf47e4ff612cf4c3235d8825402aa5d07b3d9cdfcbc2afda"} Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.137544 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cdb\" (UniqueName: \"kubernetes.io/projected/3e8201a9-2b2a-46f0-81ea-db939e25d192-kube-api-access-p6cdb\") pod \"nova-cell1-db-create-66cds\" (UID: \"3e8201a9-2b2a-46f0-81ea-db939e25d192\") " pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.157484 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cdb\" (UniqueName: \"kubernetes.io/projected/3e8201a9-2b2a-46f0-81ea-db939e25d192-kube-api-access-p6cdb\") pod \"nova-cell1-db-create-66cds\" (UID: \"3e8201a9-2b2a-46f0-81ea-db939e25d192\") " pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.232835 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.319222 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.535109 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:22 crc kubenswrapper[4797]: I1013 13:26:22.536866 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:26:24 crc kubenswrapper[4797]: I1013 13:26:24.021202 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:24 crc kubenswrapper[4797]: I1013 13:26:24.272422 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 13 13:26:24 crc kubenswrapper[4797]: I1013 13:26:24.490749 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": dial tcp 10.217.0.148:9292: connect: connection refused" Oct 13 13:26:24 crc kubenswrapper[4797]: I1013 13:26:24.490723 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": dial tcp 10.217.0.148:9292: connect: connection refused" Oct 13 13:26:24 crc kubenswrapper[4797]: I1013 13:26:24.572664 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:48386->10.217.0.150:9292: read: connection reset by peer" Oct 13 13:26:24 crc kubenswrapper[4797]: I1013 13:26:24.572794 4797 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:48394->10.217.0.150:9292: read: connection reset by peer" Oct 13 13:26:25 crc kubenswrapper[4797]: I1013 13:26:25.114549 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:26:25 crc kubenswrapper[4797]: I1013 13:26:25.162809 4797 generic.go:334] "Generic (PLEG): container finished" podID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerID="8a2c55dde5e7cdb88e6195d0cc7cf9c60c2d76d91b4fbb30b8a991240a8dfd5a" exitCode=0 Oct 13 13:26:25 crc kubenswrapper[4797]: I1013 13:26:25.162867 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99cf4c75-5042-4f58-945f-5461cad0fbcc","Type":"ContainerDied","Data":"8a2c55dde5e7cdb88e6195d0cc7cf9c60c2d76d91b4fbb30b8a991240a8dfd5a"} Oct 13 13:26:25 crc kubenswrapper[4797]: I1013 13:26:25.190094 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-966b5d6fd-tjmcl"] Oct 13 13:26:25 crc kubenswrapper[4797]: I1013 13:26:25.190375 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-966b5d6fd-tjmcl" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api-log" containerID="cri-o://c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c" gracePeriod=30 Oct 13 13:26:25 crc kubenswrapper[4797]: I1013 13:26:25.190521 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-966b5d6fd-tjmcl" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api" containerID="cri-o://ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf" gracePeriod=30 Oct 13 13:26:26 crc 
kubenswrapper[4797]: I1013 13:26:26.173795 4797 generic.go:334] "Generic (PLEG): container finished" podID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerID="c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c" exitCode=143 Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.173867 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-966b5d6fd-tjmcl" event={"ID":"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3","Type":"ContainerDied","Data":"c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c"} Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.625004 4797 scope.go:117] "RemoveContainer" containerID="1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c" Oct 13 13:26:26 crc kubenswrapper[4797]: E1013 13:26:26.633153 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:c2ebfcd639773ddc62a37198da44de8fd76348610d91424b880d41a51b702418" Oct 13 13:26:26 crc kubenswrapper[4797]: E1013 13:26:26.633329 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:c2ebfcd639773ddc62a37198da44de8fd76348610d91424b880d41a51b702418,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndh548h654h5fbh574h677h5d4h6bh597h669h695h55hbdhc6h54h76h89h5d4h89h95h8ch668h644h64fhdhb8h88h597hc5h647h4hfcq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lwl7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(f8688abd-e654-404b-924c-9e4cf255f4e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 13:26:26 crc kubenswrapper[4797]: E1013 13:26:26.634510 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.886378 4797 scope.go:117] "RemoveContainer" containerID="322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a" Oct 13 13:26:26 crc kubenswrapper[4797]: E1013 13:26:26.887112 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a\": container with ID starting with 322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a not found: ID does not exist" containerID="322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a" Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.887177 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a"} err="failed to get container status \"322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a\": rpc error: code = NotFound desc = could not find 
container \"322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a\": container with ID starting with 322ab71a472f62847155ef62aae8f13f5be5d55a48cf23fc4a7574bab88d162a not found: ID does not exist" Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.887200 4797 scope.go:117] "RemoveContainer" containerID="1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c" Oct 13 13:26:26 crc kubenswrapper[4797]: E1013 13:26:26.891057 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c\": container with ID starting with 1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c not found: ID does not exist" containerID="1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c" Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.891104 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c"} err="failed to get container status \"1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c\": rpc error: code = NotFound desc = could not find container \"1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c\": container with ID starting with 1f19d65fd4aec7a257dc3909c51e0b4fea4ba934b5e63785ab9d2495723d0e7c not found: ID does not exist" Oct 13 13:26:26 crc kubenswrapper[4797]: I1013 13:26:26.891133 4797 scope.go:117] "RemoveContainer" containerID="d13bde04be8878f52602789b8a495d96204227aa290488bc4d6eac0aef285521" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.152020 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.229385 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.238515 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.238862 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.259786 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:26:27 crc kubenswrapper[4797]: E1013 13:26:27.271799 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:c2ebfcd639773ddc62a37198da44de8fd76348610d91424b880d41a51b702418\\\"\"" pod="openstack/openstackclient" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.299532 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee338af0-a1a1-4515-973b-b914753e76cf","Type":"ContainerDied","Data":"1b66a64e7dac0a16d85e2c05b5daca93768ea25698d8906017ed789b0de9b098"} Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.299581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d333dae8-679d-4469-bcbf-0c9ee221b136","Type":"ContainerDied","Data":"7ec6324b7ab097e36ec43ff9b0e2d131c1273909807bdeb438d5a17104e6a043"} Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.299603 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"7a2f1a197e052d816aea722ded8ddb41413d4d55e91b26d0412cfadb04dd4ef6"} Oct 13 
13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.299634 4797 scope.go:117] "RemoveContainer" containerID="2c1ef579b9023feaff895b2c21122b7c802d2b9e0b2a3b211f8c95a0dcaab98e" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.341609 4797 scope.go:117] "RemoveContainer" containerID="d034c56424f578783826d4b8e3e3adf2868eb02a2d2f39809be377fcbd84b10a" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359044 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-combined-ca-bundle\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359099 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359121 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-config-data\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359146 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-sg-core-conf-yaml\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359177 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-config-data\") pod 
\"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359236 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lb9b\" (UniqueName: \"kubernetes.io/projected/6252a65e-85f5-42cf-9fce-3cd585d5e834-kube-api-access-8lb9b\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359291 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-scripts\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359356 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-logs\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359386 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-scripts\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359429 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d333dae8-679d-4469-bcbf-0c9ee221b136-etc-machine-id\") pod \"d333dae8-679d-4469-bcbf-0c9ee221b136\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359458 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-combined-ca-bundle\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359482 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-httpd-run\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359504 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-scripts\") pod \"d333dae8-679d-4469-bcbf-0c9ee221b136\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359522 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data\") pod \"d333dae8-679d-4469-bcbf-0c9ee221b136\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359592 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-log-httpd\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359640 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clmsj\" (UniqueName: \"kubernetes.io/projected/d333dae8-679d-4469-bcbf-0c9ee221b136-kube-api-access-clmsj\") pod \"d333dae8-679d-4469-bcbf-0c9ee221b136\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359766 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-run-httpd\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359786 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data-custom\") pod \"d333dae8-679d-4469-bcbf-0c9ee221b136\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359822 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-internal-tls-certs\") pod \"6252a65e-85f5-42cf-9fce-3cd585d5e834\" (UID: \"6252a65e-85f5-42cf-9fce-3cd585d5e834\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359859 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-combined-ca-bundle\") pod \"d333dae8-679d-4469-bcbf-0c9ee221b136\" (UID: \"d333dae8-679d-4469-bcbf-0c9ee221b136\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.359900 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8mjg\" (UniqueName: \"kubernetes.io/projected/ee338af0-a1a1-4515-973b-b914753e76cf-kube-api-access-d8mjg\") pod \"ee338af0-a1a1-4515-973b-b914753e76cf\" (UID: \"ee338af0-a1a1-4515-973b-b914753e76cf\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.360124 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d333dae8-679d-4469-bcbf-0c9ee221b136-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"d333dae8-679d-4469-bcbf-0c9ee221b136" (UID: "d333dae8-679d-4469-bcbf-0c9ee221b136"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.360421 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d333dae8-679d-4469-bcbf-0c9ee221b136-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.373438 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.378234 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.379752 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.380574 4797 scope.go:117] "RemoveContainer" containerID="c1c0a25f61889177289bbe5d1b1f2d63cfdb3457c8b73ec447a0b59ec80484d3" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.381798 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-logs" (OuterVolumeSpecName: "logs") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.386634 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-scripts" (OuterVolumeSpecName: "scripts") pod "d333dae8-679d-4469-bcbf-0c9ee221b136" (UID: "d333dae8-679d-4469-bcbf-0c9ee221b136"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.392225 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-scripts" (OuterVolumeSpecName: "scripts") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.392223 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.392761 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a65e-85f5-42cf-9fce-3cd585d5e834-kube-api-access-8lb9b" (OuterVolumeSpecName: "kube-api-access-8lb9b") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "kube-api-access-8lb9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.396388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d333dae8-679d-4469-bcbf-0c9ee221b136" (UID: "d333dae8-679d-4469-bcbf-0c9ee221b136"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.396404 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d333dae8-679d-4469-bcbf-0c9ee221b136-kube-api-access-clmsj" (OuterVolumeSpecName: "kube-api-access-clmsj") pod "d333dae8-679d-4469-bcbf-0c9ee221b136" (UID: "d333dae8-679d-4469-bcbf-0c9ee221b136"). InnerVolumeSpecName "kube-api-access-clmsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.396422 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-scripts" (OuterVolumeSpecName: "scripts") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.402659 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee338af0-a1a1-4515-973b-b914753e76cf-kube-api-access-d8mjg" (OuterVolumeSpecName: "kube-api-access-d8mjg") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "kube-api-access-d8mjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.425245 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.447196 4797 scope.go:117] "RemoveContainer" containerID="aa3f0cfe1c8e9a489f2c9bfe5437f7132bf4938834a5ca666bd15a9bde15355b" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463388 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463411 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463420 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8mjg\" (UniqueName: \"kubernetes.io/projected/ee338af0-a1a1-4515-973b-b914753e76cf-kube-api-access-d8mjg\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: 
I1013 13:26:27.463443 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463452 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lb9b\" (UniqueName: \"kubernetes.io/projected/6252a65e-85f5-42cf-9fce-3cd585d5e834-kube-api-access-8lb9b\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463460 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463468 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463477 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463485 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463493 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6252a65e-85f5-42cf-9fce-3cd585d5e834-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463501 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-scripts\") on 
node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463510 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee338af0-a1a1-4515-973b-b914753e76cf-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.463519 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clmsj\" (UniqueName: \"kubernetes.io/projected/d333dae8-679d-4469-bcbf-0c9ee221b136-kube-api-access-clmsj\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.487047 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.492174 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.516184 4797 scope.go:117] "RemoveContainer" containerID="f29b8911fc63bc6933a9f53dc18e60ef397bda730d601a5ef719e2d0ce2050d8" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.531487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.565841 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.565883 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.565895 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.571915 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-config-data" (OuterVolumeSpecName: "config-data") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.577048 4797 scope.go:117] "RemoveContainer" containerID="9ca29efbdc2539e87f73a20111b58a34b8cb37cf884c812f37b79e1055717b72" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.592578 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data" (OuterVolumeSpecName: "config-data") pod "d333dae8-679d-4469-bcbf-0c9ee221b136" (UID: "d333dae8-679d-4469-bcbf-0c9ee221b136"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.602138 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.625808 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d333dae8-679d-4469-bcbf-0c9ee221b136" (UID: "d333dae8-679d-4469-bcbf-0c9ee221b136"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.655714 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6252a65e-85f5-42cf-9fce-3cd585d5e834" (UID: "6252a65e-85f5-42cf-9fce-3cd585d5e834"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.667980 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-httpd-config\") pod \"bdf73d21-1504-42d0-9521-9d1201947cc9\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668051 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-ovndb-tls-certs\") pod \"bdf73d21-1504-42d0-9521-9d1201947cc9\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668070 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8lfc\" (UniqueName: \"kubernetes.io/projected/bdf73d21-1504-42d0-9521-9d1201947cc9-kube-api-access-g8lfc\") pod \"bdf73d21-1504-42d0-9521-9d1201947cc9\" (UID: 
\"bdf73d21-1504-42d0-9521-9d1201947cc9\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668133 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-combined-ca-bundle\") pod \"bdf73d21-1504-42d0-9521-9d1201947cc9\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668503 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668515 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668525 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d333dae8-679d-4469-bcbf-0c9ee221b136-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.668534 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6252a65e-85f5-42cf-9fce-3cd585d5e834-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.674489 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf73d21-1504-42d0-9521-9d1201947cc9-kube-api-access-g8lfc" (OuterVolumeSpecName: "kube-api-access-g8lfc") pod "bdf73d21-1504-42d0-9521-9d1201947cc9" (UID: "bdf73d21-1504-42d0-9521-9d1201947cc9"). InnerVolumeSpecName "kube-api-access-g8lfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.687195 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bdf73d21-1504-42d0-9521-9d1201947cc9" (UID: "bdf73d21-1504-42d0-9521-9d1201947cc9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.693761 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-config-data" (OuterVolumeSpecName: "config-data") pod "ee338af0-a1a1-4515-973b-b914753e76cf" (UID: "ee338af0-a1a1-4515-973b-b914753e76cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.725713 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdf73d21-1504-42d0-9521-9d1201947cc9" (UID: "bdf73d21-1504-42d0-9521-9d1201947cc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.775380 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-config\") pod \"bdf73d21-1504-42d0-9521-9d1201947cc9\" (UID: \"bdf73d21-1504-42d0-9521-9d1201947cc9\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.776250 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.776274 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8lfc\" (UniqueName: \"kubernetes.io/projected/bdf73d21-1504-42d0-9521-9d1201947cc9-kube-api-access-g8lfc\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.776288 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.776298 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee338af0-a1a1-4515-973b-b914753e76cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.858718 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-config" (OuterVolumeSpecName: "config") pod "bdf73d21-1504-42d0-9521-9d1201947cc9" (UID: "bdf73d21-1504-42d0-9521-9d1201947cc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.858996 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bdf73d21-1504-42d0-9521-9d1201947cc9" (UID: "bdf73d21-1504-42d0-9521-9d1201947cc9"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.877595 4797 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.877630 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdf73d21-1504-42d0-9521-9d1201947cc9-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.926926 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.978730 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.982861 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-scripts\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.982897 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-config-data\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.982974 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.983025 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-public-tls-certs\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.983066 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-httpd-run\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.983103 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-combined-ca-bundle\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.983129 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-logs\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.983157 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74tj7\" (UniqueName: \"kubernetes.io/projected/99cf4c75-5042-4f58-945f-5461cad0fbcc-kube-api-access-74tj7\") pod \"99cf4c75-5042-4f58-945f-5461cad0fbcc\" (UID: \"99cf4c75-5042-4f58-945f-5461cad0fbcc\") " Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.985238 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-logs" (OuterVolumeSpecName: "logs") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.985482 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:27 crc kubenswrapper[4797]: I1013 13:26:27.996955 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-scripts" (OuterVolumeSpecName: "scripts") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:27.999329 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cf4c75-5042-4f58-945f-5461cad0fbcc-kube-api-access-74tj7" (OuterVolumeSpecName: "kube-api-access-74tj7") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "kube-api-access-74tj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.006012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.006075 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020072 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: W1013 13:26:28.020485 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50e6d20e_a3e6_43a4_a10c_2d7d06d1cfeb.slice/crio-c07ee4c55fa727ccaaa9ad4fb673dc8868c833295837b0e27b7c0a929eac9c43 WatchSource:0}: Error finding container c07ee4c55fa727ccaaa9ad4fb673dc8868c833295837b0e27b7c0a929eac9c43: Status 404 returned error can't find the container with id c07ee4c55fa727ccaaa9ad4fb673dc8868c833295837b0e27b7c0a929eac9c43 Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020499 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020532 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020547 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="cinder-scheduler" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020555 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="cinder-scheduler" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020567 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="probe" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020573 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="probe" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020584 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="sg-core" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020592 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="sg-core" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020599 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="proxy-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020606 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="proxy-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020617 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-log" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020622 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-log" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020635 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-notification-agent" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020641 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-notification-agent" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020653 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020659 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020673 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020679 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020687 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-log" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020692 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-log" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020700 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-central-agent" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020707 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-central-agent" Oct 13 13:26:28 crc kubenswrapper[4797]: E1013 13:26:28.020719 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-api" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020727 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-api" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020826 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: 
"99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020915 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="proxy-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020931 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-central-agent" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020940 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020956 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="probe" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020964 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020975 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" containerName="glance-log" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020985 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" containerName="cinder-scheduler" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.020997 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-log" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.021007 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="sg-core" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.021014 
4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" containerName="glance-httpd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.021024 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" containerName="ceilometer-notification-agent" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.021032 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" containerName="neutron-api" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.023044 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.029780 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.030018 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.068984 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9kldr"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.081945 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.086076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.087518 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.087587 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-run-httpd\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.087617 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.087662 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76hjn\" (UniqueName: \"kubernetes.io/projected/22ccc9b3-d021-40db-b079-61ca5b0e3317-kube-api-access-76hjn\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.087755 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-scripts\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088156 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-config-data\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088202 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-log-httpd\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088268 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088287 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088298 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088309 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99cf4c75-5042-4f58-945f-5461cad0fbcc-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088319 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74tj7\" (UniqueName: \"kubernetes.io/projected/99cf4c75-5042-4f58-945f-5461cad0fbcc-kube-api-access-74tj7\") on node \"crc\" DevicePath \"\"" Oct 
13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088330 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.088351 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.099036 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-config-data" (OuterVolumeSpecName: "config-data") pod "99cf4c75-5042-4f58-945f-5461cad0fbcc" (UID: "99cf4c75-5042-4f58-945f-5461cad0fbcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.102147 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-66cds"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.122846 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.138601 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.142534 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.146952 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vmbzt"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.161112 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 
13:26:28.163088 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.167479 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.181603 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.191648 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-log-httpd\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.191725 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.191760 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.191797 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192066 
4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc4b497b-efb0-4294-8af9-c16bb2835e36-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192094 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-scripts\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-run-httpd\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192264 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-log-httpd\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192468 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-run-httpd\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192523 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192551 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192611 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjt5\" (UniqueName: \"kubernetes.io/projected/cc4b497b-efb0-4294-8af9-c16bb2835e36-kube-api-access-gvjt5\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192658 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76hjn\" (UniqueName: \"kubernetes.io/projected/22ccc9b3-d021-40db-b079-61ca5b0e3317-kube-api-access-76hjn\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192856 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-scripts\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.192943 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-config-data\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 
13:26:28.193484 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.195735 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99cf4c75-5042-4f58-945f-5461cad0fbcc-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.196832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-scripts\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.196916 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.197273 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-config-data\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.207294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.214180 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76hjn\" 
(UniqueName: \"kubernetes.io/projected/22ccc9b3-d021-40db-b079-61ca5b0e3317-kube-api-access-76hjn\") pod \"ceilometer-0\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.279612 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9kldr" event={"ID":"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb","Type":"ContainerStarted","Data":"c07ee4c55fa727ccaaa9ad4fb673dc8868c833295837b0e27b7c0a929eac9c43"} Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.284173 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6252a65e-85f5-42cf-9fce-3cd585d5e834","Type":"ContainerDied","Data":"6a07cbd2d7f3bc47690aa8c3e9f195eb4ca38be7f9280eddd4212dbd76a1c357"} Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.284479 4797 scope.go:117] "RemoveContainer" containerID="70667378fe0ce4e05def33b1987bc19d6f57f64b4c4138291bc6588d9f55a2dc" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.284227 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.296076 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.296257 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"99cf4c75-5042-4f58-945f-5461cad0fbcc","Type":"ContainerDied","Data":"a45547ea08b1ea0857b051b937df96a89df3098205ac91cc97a88802d7e32512"} Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.297028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.297084 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.297117 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc4b497b-efb0-4294-8af9-c16bb2835e36-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.297133 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-scripts\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.297155 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.297202 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjt5\" (UniqueName: \"kubernetes.io/projected/cc4b497b-efb0-4294-8af9-c16bb2835e36-kube-api-access-gvjt5\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.300291 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc4b497b-efb0-4294-8af9-c16bb2835e36-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.303150 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.305765 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vmbzt" event={"ID":"f6745956-a404-4481-9d3e-3a8056d7aaf2","Type":"ContainerStarted","Data":"e2d50c42d3673947982f8b64a4d8d8e7a6e5dd02e8a591cfebb06429caadcbbf"} Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.308095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-scripts\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 
13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.316418 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.321186 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-66cds" event={"ID":"3e8201a9-2b2a-46f0-81ea-db939e25d192","Type":"ContainerStarted","Data":"d5fed2b2702b28a9661a3dfc6f0b40c7f995763f9fe7eda59b71d74989a5822d"} Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.332736 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.337495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjt5\" (UniqueName: \"kubernetes.io/projected/cc4b497b-efb0-4294-8af9-c16bb2835e36-kube-api-access-gvjt5\") pod \"cinder-scheduler-0\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.342886 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.343386 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.347665 4797 scope.go:117] "RemoveContainer" containerID="b445980dba3c97e4a3a6a603ee26d9e781dbb8bb8304c528c84252ac702779dd" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.349382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd4c97c98-mmgwk" event={"ID":"bdf73d21-1504-42d0-9521-9d1201947cc9","Type":"ContainerDied","Data":"3d7bddb8281e44c336fd313b3e15c3fb83c4fe41e3a22e6ea040304171c185e8"} Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.349467 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fd4c97c98-mmgwk" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.360212 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.371641 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-66cds" podStartSLOduration=7.371620655 podStartE2EDuration="7.371620655s" podCreationTimestamp="2025-10-13 13:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:28.347006401 +0000 UTC m=+1165.880556667" watchObservedRunningTime="2025-10-13 13:26:28.371620655 +0000 UTC m=+1165.905170911" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.388048 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.389534 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.395001 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.399153 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.399398 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c7pxm" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.399536 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.399666 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.400572 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-966b5d6fd-tjmcl" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:52968->10.217.0.163:9311: read: connection reset by peer" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.400778 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-966b5d6fd-tjmcl" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:52978->10.217.0.163:9311: read: connection reset by peer" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.402944 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.441281 4797 scope.go:117] "RemoveContainer" 
containerID="8a2c55dde5e7cdb88e6195d0cc7cf9c60c2d76d91b4fbb30b8a991240a8dfd5a" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.459038 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.494827 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.495217 4797 scope.go:117] "RemoveContainer" containerID="e95744e7a7fb393cf4dc18192de5ca1a6dbdc1518ad298e95bb74bf374852a4b" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.496551 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.497499 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.500752 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.501514 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503036 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503116 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503152 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n46m9\" (UniqueName: \"kubernetes.io/projected/e2be119d-ecfb-4f81-b947-46797c215b8e-kube-api-access-n46m9\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503174 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503206 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503229 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503276 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.503306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.508588 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.527723 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7fd4c97c98-mmgwk"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.552182 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7fd4c97c98-mmgwk"] Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.604740 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605055 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-scripts\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605072 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605125 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605157 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-config-data\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605178 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkhd\" (UniqueName: \"kubernetes.io/projected/416aefad-3318-4406-b1c1-fdba0ce21437-kube-api-access-vpkhd\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605204 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605223 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n46m9\" (UniqueName: 
\"kubernetes.io/projected/e2be119d-ecfb-4f81-b947-46797c215b8e-kube-api-access-n46m9\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605238 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605265 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605284 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605300 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605335 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605372 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-logs\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.605391 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.606448 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.626265 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.626484 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.627699 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.627944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.628771 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n46m9\" (UniqueName: \"kubernetes.io/projected/e2be119d-ecfb-4f81-b947-46797c215b8e-kube-api-access-n46m9\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.634550 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.657601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.681701 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.706915 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707228 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707327 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-logs\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " 
pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707520 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-scripts\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707603 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707855 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-config-data\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.707951 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkhd\" (UniqueName: \"kubernetes.io/projected/416aefad-3318-4406-b1c1-fdba0ce21437-kube-api-access-vpkhd\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc 
kubenswrapper[4797]: I1013 13:26:28.709982 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.711594 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-logs\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.711930 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.713358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.717375 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-config-data\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.720060 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.720674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-scripts\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.736637 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkhd\" (UniqueName: \"kubernetes.io/projected/416aefad-3318-4406-b1c1-fdba0ce21437-kube-api-access-vpkhd\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.821413 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.874703 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.902022 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.914090 4797 scope.go:117] "RemoveContainer" containerID="8be75a01073153c2b6a4e433be7da517f34e589f7fa0ce0cac175dab6cfec505" Oct 13 13:26:28 crc kubenswrapper[4797]: I1013 13:26:28.969897 4797 scope.go:117] "RemoveContainer" containerID="57283ac326d0242aaf47e4ff612cf4c3235d8825402aa5d07b3d9cdfcbc2afda" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.056469 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.092426 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.223651 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-logs\") pod \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.223764 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data\") pod \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.223785 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data-custom\") pod \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.223810 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-combined-ca-bundle\") pod \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.223902 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp6zb\" (UniqueName: \"kubernetes.io/projected/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-kube-api-access-bp6zb\") pod \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\" (UID: \"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3\") " Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.225578 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-logs" (OuterVolumeSpecName: "logs") pod "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" (UID: "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.229701 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" (UID: "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.232147 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-kube-api-access-bp6zb" (OuterVolumeSpecName: "kube-api-access-bp6zb") pod "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" (UID: "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3"). InnerVolumeSpecName "kube-api-access-bp6zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.251019 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6252a65e-85f5-42cf-9fce-3cd585d5e834" path="/var/lib/kubelet/pods/6252a65e-85f5-42cf-9fce-3cd585d5e834/volumes" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.252378 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99cf4c75-5042-4f58-945f-5461cad0fbcc" path="/var/lib/kubelet/pods/99cf4c75-5042-4f58-945f-5461cad0fbcc/volumes" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.253237 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf73d21-1504-42d0-9521-9d1201947cc9" path="/var/lib/kubelet/pods/bdf73d21-1504-42d0-9521-9d1201947cc9/volumes" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.254616 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d333dae8-679d-4469-bcbf-0c9ee221b136" path="/var/lib/kubelet/pods/d333dae8-679d-4469-bcbf-0c9ee221b136/volumes" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.255851 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee338af0-a1a1-4515-973b-b914753e76cf" path="/var/lib/kubelet/pods/ee338af0-a1a1-4515-973b-b914753e76cf/volumes" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.259130 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" (UID: "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.307222 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data" (OuterVolumeSpecName: "config-data") pod "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" (UID: "c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.327014 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.327051 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.327106 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.327180 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp6zb\" (UniqueName: \"kubernetes.io/projected/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-kube-api-access-bp6zb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.327200 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.331268 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:26:29 crc 
kubenswrapper[4797]: W1013 13:26:29.339477 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc4b497b_efb0_4294_8af9_c16bb2835e36.slice/crio-f5926e75952e74a6189644d96cefddb84f5ef84b17df4a64d1dd0322928cf2a9 WatchSource:0}: Error finding container f5926e75952e74a6189644d96cefddb84f5ef84b17df4a64d1dd0322928cf2a9: Status 404 returned error can't find the container with id f5926e75952e74a6189644d96cefddb84f5ef84b17df4a64d1dd0322928cf2a9 Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.373924 4797 generic.go:334] "Generic (PLEG): container finished" podID="f6745956-a404-4481-9d3e-3a8056d7aaf2" containerID="c597d4730edec7f44d1fce4d0ce4d87234325f19fc8caa11012c424af77676b8" exitCode=0 Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.374009 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vmbzt" event={"ID":"f6745956-a404-4481-9d3e-3a8056d7aaf2","Type":"ContainerDied","Data":"c597d4730edec7f44d1fce4d0ce4d87234325f19fc8caa11012c424af77676b8"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.376297 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cc4b497b-efb0-4294-8af9-c16bb2835e36","Type":"ContainerStarted","Data":"f5926e75952e74a6189644d96cefddb84f5ef84b17df4a64d1dd0322928cf2a9"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.388250 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerStarted","Data":"82f4f99a01ccfb73cfe7d9b8fe38d647043ed3332121c01e00bac35dd182d5b8"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.394129 4797 generic.go:334] "Generic (PLEG): container finished" podID="3e8201a9-2b2a-46f0-81ea-db939e25d192" containerID="2a0a2151ae65744b3d130619b0d9e3af037af9e2ad82efcda00139bb22be61e9" exitCode=0 Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 
13:26:29.394206 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-66cds" event={"ID":"3e8201a9-2b2a-46f0-81ea-db939e25d192","Type":"ContainerDied","Data":"2a0a2151ae65744b3d130619b0d9e3af037af9e2ad82efcda00139bb22be61e9"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.397373 4797 generic.go:334] "Generic (PLEG): container finished" podID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerID="ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf" exitCode=0 Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.397421 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-966b5d6fd-tjmcl" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.397468 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-966b5d6fd-tjmcl" event={"ID":"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3","Type":"ContainerDied","Data":"ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.397516 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-966b5d6fd-tjmcl" event={"ID":"c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3","Type":"ContainerDied","Data":"a18f5c124fd9bc0689e2bd5852f56e0f4132553028de3ee5d0cc04a5c48490c0"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.397539 4797 scope.go:117] "RemoveContainer" containerID="ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.400106 4797 generic.go:334] "Generic (PLEG): container finished" podID="50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb" containerID="ac261ec2207e3bf512b4dde7e4a1e1e0dc07b2d3fbe679680f2d4d0c24ca5ddd" exitCode=0 Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.400138 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9kldr" 
event={"ID":"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb","Type":"ContainerDied","Data":"ac261ec2207e3bf512b4dde7e4a1e1e0dc07b2d3fbe679680f2d4d0c24ca5ddd"} Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.423200 4797 scope.go:117] "RemoveContainer" containerID="c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.466473 4797 scope.go:117] "RemoveContainer" containerID="ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf" Oct 13 13:26:29 crc kubenswrapper[4797]: E1013 13:26:29.468137 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf\": container with ID starting with ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf not found: ID does not exist" containerID="ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.468215 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf"} err="failed to get container status \"ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf\": rpc error: code = NotFound desc = could not find container \"ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf\": container with ID starting with ffef222b79dd5a63d8d3ba55b0f0a1b56a6dbb536e90807283e7732510a1adcf not found: ID does not exist" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.468246 4797 scope.go:117] "RemoveContainer" containerID="c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c" Oct 13 13:26:29 crc kubenswrapper[4797]: E1013 13:26:29.469567 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c\": container with ID starting with c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c not found: ID does not exist" containerID="c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.469608 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c"} err="failed to get container status \"c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c\": rpc error: code = NotFound desc = could not find container \"c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c\": container with ID starting with c194d424777b528262d7faf7dbc68e74d37a3b6e2e0d43724b482adbb86ba61c not found: ID does not exist" Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.480550 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-966b5d6fd-tjmcl"] Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.506614 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-966b5d6fd-tjmcl"] Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.575064 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:26:29 crc kubenswrapper[4797]: W1013 13:26:29.583276 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod416aefad_3318_4406_b1c1_fdba0ce21437.slice/crio-1eba22894556bf52053730874bb909c993976173f99d4d27a41487b223cb3754 WatchSource:0}: Error finding container 1eba22894556bf52053730874bb909c993976173f99d4d27a41487b223cb3754: Status 404 returned error can't find the container with id 1eba22894556bf52053730874bb909c993976173f99d4d27a41487b223cb3754 Oct 13 13:26:29 crc kubenswrapper[4797]: I1013 13:26:29.656940 4797 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:26:30 crc kubenswrapper[4797]: I1013 13:26:30.451311 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"416aefad-3318-4406-b1c1-fdba0ce21437","Type":"ContainerStarted","Data":"52fe991f1e2fd4f29fc6e6714cf185447bb07fb1c6293b5cab3549dd2ea20248"} Oct 13 13:26:30 crc kubenswrapper[4797]: I1013 13:26:30.451575 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"416aefad-3318-4406-b1c1-fdba0ce21437","Type":"ContainerStarted","Data":"1eba22894556bf52053730874bb909c993976173f99d4d27a41487b223cb3754"} Oct 13 13:26:30 crc kubenswrapper[4797]: I1013 13:26:30.457384 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerStarted","Data":"51156b7f654f2351addece98314f3734c22b153196731a32c55d751e93d7ab99"} Oct 13 13:26:30 crc kubenswrapper[4797]: I1013 13:26:30.461231 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cc4b497b-efb0-4294-8af9-c16bb2835e36","Type":"ContainerStarted","Data":"cf81e79b12c28741c928f832632179cdb954c2366581d8d7306d9be83dbc6228"} Oct 13 13:26:30 crc kubenswrapper[4797]: I1013 13:26:30.463195 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2be119d-ecfb-4f81-b947-46797c215b8e","Type":"ContainerStarted","Data":"5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e"} Oct 13 13:26:30 crc kubenswrapper[4797]: I1013 13:26:30.463217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2be119d-ecfb-4f81-b947-46797c215b8e","Type":"ContainerStarted","Data":"f72353f78f89b5ecf63f883eb7c92b882454c58c1f86b5ffa57341e163f8f8d6"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 
13:26:31.148119 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.195841 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.196667 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.253827 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" path="/var/lib/kubelet/pods/c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3/volumes" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.269631 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pkkn\" (UniqueName: \"kubernetes.io/projected/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb-kube-api-access-9pkkn\") pod \"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb\" (UID: \"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb\") " Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.289335 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb-kube-api-access-9pkkn" (OuterVolumeSpecName: "kube-api-access-9pkkn") pod "50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb" (UID: "50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb"). InnerVolumeSpecName "kube-api-access-9pkkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.371091 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6cdb\" (UniqueName: \"kubernetes.io/projected/3e8201a9-2b2a-46f0-81ea-db939e25d192-kube-api-access-p6cdb\") pod \"3e8201a9-2b2a-46f0-81ea-db939e25d192\" (UID: \"3e8201a9-2b2a-46f0-81ea-db939e25d192\") " Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.371587 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8lmx\" (UniqueName: \"kubernetes.io/projected/f6745956-a404-4481-9d3e-3a8056d7aaf2-kube-api-access-m8lmx\") pod \"f6745956-a404-4481-9d3e-3a8056d7aaf2\" (UID: \"f6745956-a404-4481-9d3e-3a8056d7aaf2\") " Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.372188 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pkkn\" (UniqueName: \"kubernetes.io/projected/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb-kube-api-access-9pkkn\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.391696 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8201a9-2b2a-46f0-81ea-db939e25d192-kube-api-access-p6cdb" (OuterVolumeSpecName: "kube-api-access-p6cdb") pod "3e8201a9-2b2a-46f0-81ea-db939e25d192" (UID: "3e8201a9-2b2a-46f0-81ea-db939e25d192"). InnerVolumeSpecName "kube-api-access-p6cdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.393343 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6745956-a404-4481-9d3e-3a8056d7aaf2-kube-api-access-m8lmx" (OuterVolumeSpecName: "kube-api-access-m8lmx") pod "f6745956-a404-4481-9d3e-3a8056d7aaf2" (UID: "f6745956-a404-4481-9d3e-3a8056d7aaf2"). InnerVolumeSpecName "kube-api-access-m8lmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.474118 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6cdb\" (UniqueName: \"kubernetes.io/projected/3e8201a9-2b2a-46f0-81ea-db939e25d192-kube-api-access-p6cdb\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.474150 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8lmx\" (UniqueName: \"kubernetes.io/projected/f6745956-a404-4481-9d3e-3a8056d7aaf2-kube-api-access-m8lmx\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.499433 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9kldr" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.502305 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9kldr" event={"ID":"50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb","Type":"ContainerDied","Data":"c07ee4c55fa727ccaaa9ad4fb673dc8868c833295837b0e27b7c0a929eac9c43"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.502376 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c07ee4c55fa727ccaaa9ad4fb673dc8868c833295837b0e27b7c0a929eac9c43" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.515574 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vmbzt" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.515584 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vmbzt" event={"ID":"f6745956-a404-4481-9d3e-3a8056d7aaf2","Type":"ContainerDied","Data":"e2d50c42d3673947982f8b64a4d8d8e7a6e5dd02e8a591cfebb06429caadcbbf"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.515626 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2d50c42d3673947982f8b64a4d8d8e7a6e5dd02e8a591cfebb06429caadcbbf" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.525167 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-66cds" event={"ID":"3e8201a9-2b2a-46f0-81ea-db939e25d192","Type":"ContainerDied","Data":"d5fed2b2702b28a9661a3dfc6f0b40c7f995763f9fe7eda59b71d74989a5822d"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.525218 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5fed2b2702b28a9661a3dfc6f0b40c7f995763f9fe7eda59b71d74989a5822d" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.525299 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-66cds" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.557568 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cc4b497b-efb0-4294-8af9-c16bb2835e36","Type":"ContainerStarted","Data":"2cf43831975875d820d476539ef4c0943fa120e06b5317c6aec843e932949edd"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.574182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2be119d-ecfb-4f81-b947-46797c215b8e","Type":"ContainerStarted","Data":"14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.584294 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.584264435 podStartE2EDuration="3.584264435s" podCreationTimestamp="2025-10-13 13:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:31.58159194 +0000 UTC m=+1169.115142196" watchObservedRunningTime="2025-10-13 13:26:31.584264435 +0000 UTC m=+1169.117814691" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.594141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"416aefad-3318-4406-b1c1-fdba0ce21437","Type":"ContainerStarted","Data":"c575ab6cf83d919cac32e185c4f667a6b3abc5c3952e0de54dbbac6e3ad28900"} Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.613177 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.613158273 podStartE2EDuration="3.613158273s" podCreationTimestamp="2025-10-13 13:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-13 13:26:31.607291699 +0000 UTC m=+1169.140841965" watchObservedRunningTime="2025-10-13 13:26:31.613158273 +0000 UTC m=+1169.146708519" Oct 13 13:26:31 crc kubenswrapper[4797]: I1013 13:26:31.637161 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.637145671 podStartE2EDuration="3.637145671s" podCreationTimestamp="2025-10-13 13:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:26:31.633624915 +0000 UTC m=+1169.167175171" watchObservedRunningTime="2025-10-13 13:26:31.637145671 +0000 UTC m=+1169.170695927" Oct 13 13:26:32 crc kubenswrapper[4797]: I1013 13:26:32.605932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerStarted","Data":"a8a603432dabb819dd64749c038881503c2649a5b5c11a0b9f656f602a376e75"} Oct 13 13:26:32 crc kubenswrapper[4797]: I1013 13:26:32.606423 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerStarted","Data":"2a35b896bf49e6998defbeffb8f926c0d787c7cc208c197ee1fe182c07667ee3"} Oct 13 13:26:33 crc kubenswrapper[4797]: I1013 13:26:33.498081 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 13 13:26:33 crc kubenswrapper[4797]: I1013 13:26:33.615981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerStarted","Data":"e804e625ce20740a024f22c3e640c20dcf0778ed122048e5d081191d6bd50780"} Oct 13 13:26:33 crc kubenswrapper[4797]: I1013 13:26:33.641886 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.3529101629999998 podStartE2EDuration="6.64186607s" podCreationTimestamp="2025-10-13 13:26:27 +0000 UTC" firstStartedPulling="2025-10-13 13:26:29.068490263 +0000 UTC m=+1166.602040519" lastFinishedPulling="2025-10-13 13:26:33.35744617 +0000 UTC m=+1170.890996426" observedRunningTime="2025-10-13 13:26:33.634322625 +0000 UTC m=+1171.167872891" watchObservedRunningTime="2025-10-13 13:26:33.64186607 +0000 UTC m=+1171.175416326" Oct 13 13:26:34 crc kubenswrapper[4797]: I1013 13:26:34.236384 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:34 crc kubenswrapper[4797]: I1013 13:26:34.624172 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:26:35 crc kubenswrapper[4797]: I1013 13:26:35.631335 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-central-agent" containerID="cri-o://51156b7f654f2351addece98314f3734c22b153196731a32c55d751e93d7ab99" gracePeriod=30 Oct 13 13:26:35 crc kubenswrapper[4797]: I1013 13:26:35.631413 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="proxy-httpd" containerID="cri-o://e804e625ce20740a024f22c3e640c20dcf0778ed122048e5d081191d6bd50780" gracePeriod=30 Oct 13 13:26:35 crc kubenswrapper[4797]: I1013 13:26:35.631437 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="sg-core" containerID="cri-o://a8a603432dabb819dd64749c038881503c2649a5b5c11a0b9f656f602a376e75" gracePeriod=30 Oct 13 13:26:35 crc kubenswrapper[4797]: I1013 13:26:35.631447 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-notification-agent" containerID="cri-o://2a35b896bf49e6998defbeffb8f926c0d787c7cc208c197ee1fe182c07667ee3" gracePeriod=30 Oct 13 13:26:36 crc kubenswrapper[4797]: I1013 13:26:36.644403 4797 generic.go:334] "Generic (PLEG): container finished" podID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerID="e804e625ce20740a024f22c3e640c20dcf0778ed122048e5d081191d6bd50780" exitCode=0 Oct 13 13:26:36 crc kubenswrapper[4797]: I1013 13:26:36.644706 4797 generic.go:334] "Generic (PLEG): container finished" podID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerID="a8a603432dabb819dd64749c038881503c2649a5b5c11a0b9f656f602a376e75" exitCode=2 Oct 13 13:26:36 crc kubenswrapper[4797]: I1013 13:26:36.644718 4797 generic.go:334] "Generic (PLEG): container finished" podID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerID="2a35b896bf49e6998defbeffb8f926c0d787c7cc208c197ee1fe182c07667ee3" exitCode=0 Oct 13 13:26:36 crc kubenswrapper[4797]: I1013 13:26:36.644486 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerDied","Data":"e804e625ce20740a024f22c3e640c20dcf0778ed122048e5d081191d6bd50780"} Oct 13 13:26:36 crc kubenswrapper[4797]: I1013 13:26:36.644760 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerDied","Data":"a8a603432dabb819dd64749c038881503c2649a5b5c11a0b9f656f602a376e75"} Oct 13 13:26:36 crc kubenswrapper[4797]: I1013 13:26:36.644779 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerDied","Data":"2a35b896bf49e6998defbeffb8f926c0d787c7cc208c197ee1fe182c07667ee3"} Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.665688 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerID="51156b7f654f2351addece98314f3734c22b153196731a32c55d751e93d7ab99" exitCode=0 Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.666838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerDied","Data":"51156b7f654f2351addece98314f3734c22b153196731a32c55d751e93d7ab99"} Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.726942 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.825743 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.876007 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.876108 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901398 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-scripts\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901586 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-log-httpd\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901694 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-sg-core-conf-yaml\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901727 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-run-httpd\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901754 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76hjn\" (UniqueName: \"kubernetes.io/projected/22ccc9b3-d021-40db-b079-61ca5b0e3317-kube-api-access-76hjn\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901783 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-config-data\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.901834 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-combined-ca-bundle\") pod \"22ccc9b3-d021-40db-b079-61ca5b0e3317\" (UID: \"22ccc9b3-d021-40db-b079-61ca5b0e3317\") " Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.903815 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.905055 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.905902 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.905192 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.909050 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-scripts" (OuterVolumeSpecName: "scripts") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.911277 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ccc9b3-d021-40db-b079-61ca5b0e3317-kube-api-access-76hjn" (OuterVolumeSpecName: "kube-api-access-76hjn") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "kube-api-access-76hjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.919610 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.927922 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.939020 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.953639 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 13:26:38 crc kubenswrapper[4797]: I1013 13:26:38.963101 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.004109 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.004238 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.004258 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22ccc9b3-d021-40db-b079-61ca5b0e3317-run-httpd\") on node 
\"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.004269 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76hjn\" (UniqueName: \"kubernetes.io/projected/22ccc9b3-d021-40db-b079-61ca5b0e3317-kube-api-access-76hjn\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.004279 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.006003 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.012774 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-config-data" (OuterVolumeSpecName: "config-data") pod "22ccc9b3-d021-40db-b079-61ca5b0e3317" (UID: "22ccc9b3-d021-40db-b079-61ca5b0e3317"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.106278 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.106303 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccc9b3-d021-40db-b079-61ca5b0e3317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.679538 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.679578 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22ccc9b3-d021-40db-b079-61ca5b0e3317","Type":"ContainerDied","Data":"82f4f99a01ccfb73cfe7d9b8fe38d647043ed3332121c01e00bac35dd182d5b8"} Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.680021 4797 scope.go:117] "RemoveContainer" containerID="e804e625ce20740a024f22c3e640c20dcf0778ed122048e5d081191d6bd50780" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.680632 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.680659 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.680671 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.680678 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 
13:26:39.702864 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.718305 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.727888 4797 scope.go:117] "RemoveContainer" containerID="a8a603432dabb819dd64749c038881503c2649a5b5c11a0b9f656f602a376e75" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.739525 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740745 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6745956-a404-4481-9d3e-3a8056d7aaf2" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740769 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6745956-a404-4481-9d3e-3a8056d7aaf2" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740787 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740794 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740824 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-notification-agent" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740832 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-notification-agent" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740844 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8201a9-2b2a-46f0-81ea-db939e25d192" 
containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740867 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8201a9-2b2a-46f0-81ea-db939e25d192" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740882 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="proxy-httpd" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740887 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="proxy-httpd" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740900 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-central-agent" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740905 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-central-agent" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740917 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740923 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740939 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="sg-core" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740945 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="sg-core" Oct 13 13:26:39 crc kubenswrapper[4797]: E1013 13:26:39.740966 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api-log" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.740972 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api-log" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741133 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="sg-core" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741147 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741165 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api-log" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741174 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c012c54e-85c7-4ee9-8513-4f5d8ae4b1c3" containerName="barbican-api" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741188 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8201a9-2b2a-46f0-81ea-db939e25d192" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741197 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-central-agent" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741205 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="ceilometer-notification-agent" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741215 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6745956-a404-4481-9d3e-3a8056d7aaf2" containerName="mariadb-database-create" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.741226 
4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" containerName="proxy-httpd" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.742731 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.749994 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.750730 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.757495 4797 scope.go:117] "RemoveContainer" containerID="2a35b896bf49e6998defbeffb8f926c0d787c7cc208c197ee1fe182c07667ee3" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.762870 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.791552 4797 scope.go:117] "RemoveContainer" containerID="51156b7f654f2351addece98314f3734c22b153196731a32c55d751e93d7ab99" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917188 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-scripts\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917611 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-run-httpd\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917643 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917702 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917742 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-log-httpd\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917785 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ldg\" (UniqueName: \"kubernetes.io/projected/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-kube-api-access-f8ldg\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:39 crc kubenswrapper[4797]: I1013 13:26:39.917858 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-config-data\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.019437 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-scripts\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.019510 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-run-httpd\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.019541 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.019596 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.019633 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-log-httpd\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.019671 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ldg\" (UniqueName: \"kubernetes.io/projected/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-kube-api-access-f8ldg\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: 
I1013 13:26:40.019693 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-config-data\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.021907 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-run-httpd\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.021926 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-log-httpd\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.023542 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.024064 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.025654 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-scripts\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " 
pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.025941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-config-data\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.045958 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ldg\" (UniqueName: \"kubernetes.io/projected/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-kube-api-access-f8ldg\") pod \"ceilometer-0\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.060290 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.535708 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:40 crc kubenswrapper[4797]: W1013 13:26:40.536994 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51806e8a_82b1_48f4_88bb_750d1a4b3cb3.slice/crio-20c6b9b2143b8326176fcd4f439be9a6a14110c7dcd79750edef49f4e12c1802 WatchSource:0}: Error finding container 20c6b9b2143b8326176fcd4f439be9a6a14110c7dcd79750edef49f4e12c1802: Status 404 returned error can't find the container with id 20c6b9b2143b8326176fcd4f439be9a6a14110c7dcd79750edef49f4e12c1802 Oct 13 13:26:40 crc kubenswrapper[4797]: I1013 13:26:40.690124 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerStarted","Data":"20c6b9b2143b8326176fcd4f439be9a6a14110c7dcd79750edef49f4e12c1802"} Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.252245 4797 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="22ccc9b3-d021-40db-b079-61ca5b0e3317" path="/var/lib/kubelet/pods/22ccc9b3-d021-40db-b079-61ca5b0e3317/volumes" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.682851 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.700313 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerStarted","Data":"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039"} Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.700368 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.701304 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.701320 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.744112 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.917062 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a754-account-create-545z2"] Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.918362 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.922332 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.933705 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a754-account-create-545z2"] Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.979396 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:41 crc kubenswrapper[4797]: I1013 13:26:41.979450 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.072674 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrwlh\" (UniqueName: \"kubernetes.io/projected/9280efc5-f863-4350-8044-7da90a6982fb-kube-api-access-qrwlh\") pod \"nova-api-a754-account-create-545z2\" (UID: \"9280efc5-f863-4350-8044-7da90a6982fb\") " pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.126300 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-84cf-account-create-hdct5"] Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.141587 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.147487 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.149435 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-84cf-account-create-hdct5"] Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.175014 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrwlh\" (UniqueName: \"kubernetes.io/projected/9280efc5-f863-4350-8044-7da90a6982fb-kube-api-access-qrwlh\") pod \"nova-api-a754-account-create-545z2\" (UID: \"9280efc5-f863-4350-8044-7da90a6982fb\") " pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.193536 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrwlh\" (UniqueName: \"kubernetes.io/projected/9280efc5-f863-4350-8044-7da90a6982fb-kube-api-access-qrwlh\") pod \"nova-api-a754-account-create-545z2\" (UID: \"9280efc5-f863-4350-8044-7da90a6982fb\") " pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.234050 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.277167 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t86w\" (UniqueName: \"kubernetes.io/projected/ef6a4b4d-5231-4b06-9fa1-695aee17a37b-kube-api-access-6t86w\") pod \"nova-cell0-84cf-account-create-hdct5\" (UID: \"ef6a4b4d-5231-4b06-9fa1-695aee17a37b\") " pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.321033 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-22bc-account-create-rrkb5"] Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.322170 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.324305 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.334460 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-22bc-account-create-rrkb5"] Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.391159 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t86w\" (UniqueName: \"kubernetes.io/projected/ef6a4b4d-5231-4b06-9fa1-695aee17a37b-kube-api-access-6t86w\") pod \"nova-cell0-84cf-account-create-hdct5\" (UID: \"ef6a4b4d-5231-4b06-9fa1-695aee17a37b\") " pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.415661 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t86w\" (UniqueName: \"kubernetes.io/projected/ef6a4b4d-5231-4b06-9fa1-695aee17a37b-kube-api-access-6t86w\") pod \"nova-cell0-84cf-account-create-hdct5\" (UID: \"ef6a4b4d-5231-4b06-9fa1-695aee17a37b\") " 
pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.467255 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.499459 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4gdz\" (UniqueName: \"kubernetes.io/projected/a8cc6e8d-cce5-423b-a60c-49587bc50452-kube-api-access-c4gdz\") pod \"nova-cell1-22bc-account-create-rrkb5\" (UID: \"a8cc6e8d-cce5-423b-a60c-49587bc50452\") " pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.602869 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4gdz\" (UniqueName: \"kubernetes.io/projected/a8cc6e8d-cce5-423b-a60c-49587bc50452-kube-api-access-c4gdz\") pod \"nova-cell1-22bc-account-create-rrkb5\" (UID: \"a8cc6e8d-cce5-423b-a60c-49587bc50452\") " pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.621852 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4gdz\" (UniqueName: \"kubernetes.io/projected/a8cc6e8d-cce5-423b-a60c-49587bc50452-kube-api-access-c4gdz\") pod \"nova-cell1-22bc-account-create-rrkb5\" (UID: \"a8cc6e8d-cce5-423b-a60c-49587bc50452\") " pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.643169 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.729982 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8688abd-e654-404b-924c-9e4cf255f4e8","Type":"ContainerStarted","Data":"a4c476b6ff3b37f629fd62076e094c447b4d35d68c78802f6d598f476f60adb0"} Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.755911 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerStarted","Data":"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9"} Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.820505 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a754-account-create-545z2"] Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.825871 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.298884775 podStartE2EDuration="34.825798466s" podCreationTimestamp="2025-10-13 13:26:08 +0000 UTC" firstStartedPulling="2025-10-13 13:26:09.182768204 +0000 UTC m=+1146.716318460" lastFinishedPulling="2025-10-13 13:26:41.709681895 +0000 UTC m=+1179.243232151" observedRunningTime="2025-10-13 13:26:42.787942868 +0000 UTC m=+1180.321493124" watchObservedRunningTime="2025-10-13 13:26:42.825798466 +0000 UTC m=+1180.359348722" Oct 13 13:26:42 crc kubenswrapper[4797]: I1013 13:26:42.977069 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-84cf-account-create-hdct5"] Oct 13 13:26:43 crc kubenswrapper[4797]: W1013 13:26:43.217178 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8cc6e8d_cce5_423b_a60c_49587bc50452.slice/crio-fd0215c27e090ca59cfc32832117cbc6841df09702a25af0dee636668dc3c026 WatchSource:0}: Error finding container 
fd0215c27e090ca59cfc32832117cbc6841df09702a25af0dee636668dc3c026: Status 404 returned error can't find the container with id fd0215c27e090ca59cfc32832117cbc6841df09702a25af0dee636668dc3c026 Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.219035 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-22bc-account-create-rrkb5"] Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.758578 4797 generic.go:334] "Generic (PLEG): container finished" podID="9280efc5-f863-4350-8044-7da90a6982fb" containerID="5dbaaac2816f861e605b3530adc848ceccb2aa44765d018c23ce49dddf5fbec3" exitCode=0 Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.759187 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a754-account-create-545z2" event={"ID":"9280efc5-f863-4350-8044-7da90a6982fb","Type":"ContainerDied","Data":"5dbaaac2816f861e605b3530adc848ceccb2aa44765d018c23ce49dddf5fbec3"} Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.759234 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a754-account-create-545z2" event={"ID":"9280efc5-f863-4350-8044-7da90a6982fb","Type":"ContainerStarted","Data":"4b5c4d7a6973352557dd1348f21ff5745c84005a495cfd3d1d662abfd753d387"} Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.763982 4797 generic.go:334] "Generic (PLEG): container finished" podID="a8cc6e8d-cce5-423b-a60c-49587bc50452" containerID="08725d77ede82c87183202476a70e1d4f3cdcf4541813fed902f4f45081356cd" exitCode=0 Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.764047 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-22bc-account-create-rrkb5" event={"ID":"a8cc6e8d-cce5-423b-a60c-49587bc50452","Type":"ContainerDied","Data":"08725d77ede82c87183202476a70e1d4f3cdcf4541813fed902f4f45081356cd"} Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.764065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-22bc-account-create-rrkb5" event={"ID":"a8cc6e8d-cce5-423b-a60c-49587bc50452","Type":"ContainerStarted","Data":"fd0215c27e090ca59cfc32832117cbc6841df09702a25af0dee636668dc3c026"} Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.773893 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerStarted","Data":"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996"} Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.781559 4797 generic.go:334] "Generic (PLEG): container finished" podID="ef6a4b4d-5231-4b06-9fa1-695aee17a37b" containerID="36e7e2aeec321eaebaa86fac8932d4e7bb3dce1420adea82887e645b8e0c07f2" exitCode=0 Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.781642 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-84cf-account-create-hdct5" event={"ID":"ef6a4b4d-5231-4b06-9fa1-695aee17a37b","Type":"ContainerDied","Data":"36e7e2aeec321eaebaa86fac8932d4e7bb3dce1420adea82887e645b8e0c07f2"} Oct 13 13:26:43 crc kubenswrapper[4797]: I1013 13:26:43.781709 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-84cf-account-create-hdct5" event={"ID":"ef6a4b4d-5231-4b06-9fa1-695aee17a37b","Type":"ContainerStarted","Data":"c3822778ac104cbb0b08c34ab12adc0bc27824503253fbf09c51d979780c43c0"} Oct 13 13:26:44 crc kubenswrapper[4797]: I1013 13:26:44.808468 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerStarted","Data":"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664"} Oct 13 13:26:44 crc kubenswrapper[4797]: I1013 13:26:44.811864 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:26:44 crc kubenswrapper[4797]: I1013 13:26:44.839301 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.212660304 podStartE2EDuration="5.839239769s" podCreationTimestamp="2025-10-13 13:26:39 +0000 UTC" firstStartedPulling="2025-10-13 13:26:40.539297562 +0000 UTC m=+1178.072847838" lastFinishedPulling="2025-10-13 13:26:44.165877047 +0000 UTC m=+1181.699427303" observedRunningTime="2025-10-13 13:26:44.829207003 +0000 UTC m=+1182.362757269" watchObservedRunningTime="2025-10-13 13:26:44.839239769 +0000 UTC m=+1182.372790025" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.283101 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.287558 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.294300 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.377078 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4gdz\" (UniqueName: \"kubernetes.io/projected/a8cc6e8d-cce5-423b-a60c-49587bc50452-kube-api-access-c4gdz\") pod \"a8cc6e8d-cce5-423b-a60c-49587bc50452\" (UID: \"a8cc6e8d-cce5-423b-a60c-49587bc50452\") " Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.377142 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrwlh\" (UniqueName: \"kubernetes.io/projected/9280efc5-f863-4350-8044-7da90a6982fb-kube-api-access-qrwlh\") pod \"9280efc5-f863-4350-8044-7da90a6982fb\" (UID: \"9280efc5-f863-4350-8044-7da90a6982fb\") " Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.377294 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t86w\" (UniqueName: 
\"kubernetes.io/projected/ef6a4b4d-5231-4b06-9fa1-695aee17a37b-kube-api-access-6t86w\") pod \"ef6a4b4d-5231-4b06-9fa1-695aee17a37b\" (UID: \"ef6a4b4d-5231-4b06-9fa1-695aee17a37b\") " Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.383969 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cc6e8d-cce5-423b-a60c-49587bc50452-kube-api-access-c4gdz" (OuterVolumeSpecName: "kube-api-access-c4gdz") pod "a8cc6e8d-cce5-423b-a60c-49587bc50452" (UID: "a8cc6e8d-cce5-423b-a60c-49587bc50452"). InnerVolumeSpecName "kube-api-access-c4gdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.385398 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6a4b4d-5231-4b06-9fa1-695aee17a37b-kube-api-access-6t86w" (OuterVolumeSpecName: "kube-api-access-6t86w") pod "ef6a4b4d-5231-4b06-9fa1-695aee17a37b" (UID: "ef6a4b4d-5231-4b06-9fa1-695aee17a37b"). InnerVolumeSpecName "kube-api-access-6t86w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.387418 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9280efc5-f863-4350-8044-7da90a6982fb-kube-api-access-qrwlh" (OuterVolumeSpecName: "kube-api-access-qrwlh") pod "9280efc5-f863-4350-8044-7da90a6982fb" (UID: "9280efc5-f863-4350-8044-7da90a6982fb"). InnerVolumeSpecName "kube-api-access-qrwlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.479278 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t86w\" (UniqueName: \"kubernetes.io/projected/ef6a4b4d-5231-4b06-9fa1-695aee17a37b-kube-api-access-6t86w\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.479553 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4gdz\" (UniqueName: \"kubernetes.io/projected/a8cc6e8d-cce5-423b-a60c-49587bc50452-kube-api-access-c4gdz\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.479566 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrwlh\" (UniqueName: \"kubernetes.io/projected/9280efc5-f863-4350-8044-7da90a6982fb-kube-api-access-qrwlh\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.819186 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-84cf-account-create-hdct5" event={"ID":"ef6a4b4d-5231-4b06-9fa1-695aee17a37b","Type":"ContainerDied","Data":"c3822778ac104cbb0b08c34ab12adc0bc27824503253fbf09c51d979780c43c0"} Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.819230 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3822778ac104cbb0b08c34ab12adc0bc27824503253fbf09c51d979780c43c0" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.819296 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-84cf-account-create-hdct5" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.829248 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a754-account-create-545z2" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.829247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a754-account-create-545z2" event={"ID":"9280efc5-f863-4350-8044-7da90a6982fb","Type":"ContainerDied","Data":"4b5c4d7a6973352557dd1348f21ff5745c84005a495cfd3d1d662abfd753d387"} Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.829375 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b5c4d7a6973352557dd1348f21ff5745c84005a495cfd3d1d662abfd753d387" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.831586 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-22bc-account-create-rrkb5" Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.833532 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-22bc-account-create-rrkb5" event={"ID":"a8cc6e8d-cce5-423b-a60c-49587bc50452","Type":"ContainerDied","Data":"fd0215c27e090ca59cfc32832117cbc6841df09702a25af0dee636668dc3c026"} Oct 13 13:26:45 crc kubenswrapper[4797]: I1013 13:26:45.833633 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0215c27e090ca59cfc32832117cbc6841df09702a25af0dee636668dc3c026" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.352616 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f5b9c"] Oct 13 13:26:47 crc kubenswrapper[4797]: E1013 13:26:47.353384 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8cc6e8d-cce5-423b-a60c-49587bc50452" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.353402 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cc6e8d-cce5-423b-a60c-49587bc50452" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: E1013 13:26:47.353429 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6a4b4d-5231-4b06-9fa1-695aee17a37b" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.353436 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6a4b4d-5231-4b06-9fa1-695aee17a37b" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: E1013 13:26:47.353452 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9280efc5-f863-4350-8044-7da90a6982fb" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.353459 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9280efc5-f863-4350-8044-7da90a6982fb" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.353661 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8cc6e8d-cce5-423b-a60c-49587bc50452" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.353694 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6a4b4d-5231-4b06-9fa1-695aee17a37b" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.353706 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9280efc5-f863-4350-8044-7da90a6982fb" containerName="mariadb-account-create" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.354437 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.356768 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.356850 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.356995 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vngh9" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.372634 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f5b9c"] Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.413341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v579\" (UniqueName: \"kubernetes.io/projected/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-kube-api-access-8v579\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.413398 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-config-data\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.413490 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-scripts\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " 
pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.413528 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.514680 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-scripts\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.514755 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.514904 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v579\" (UniqueName: \"kubernetes.io/projected/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-kube-api-access-8v579\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.514938 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-config-data\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: 
\"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.520393 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.521732 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-config-data\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.522102 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-scripts\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.534048 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v579\" (UniqueName: \"kubernetes.io/projected/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-kube-api-access-8v579\") pod \"nova-cell0-conductor-db-sync-f5b9c\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:47 crc kubenswrapper[4797]: I1013 13:26:47.674533 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 13:26:48.205465 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f5b9c"] Oct 13 13:26:48 crc kubenswrapper[4797]: W1013 13:26:48.206421 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda88a7d56_1b07_41ef_94ec_d39f1c9bfb76.slice/crio-5ff3c808013c7e6e1e43d22b37ea6b995bafa9b3b243e0b104ae949bf965fb09 WatchSource:0}: Error finding container 5ff3c808013c7e6e1e43d22b37ea6b995bafa9b3b243e0b104ae949bf965fb09: Status 404 returned error can't find the container with id 5ff3c808013c7e6e1e43d22b37ea6b995bafa9b3b243e0b104ae949bf965fb09 Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 13:26:48.880689 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" event={"ID":"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76","Type":"ContainerStarted","Data":"5ff3c808013c7e6e1e43d22b37ea6b995bafa9b3b243e0b104ae949bf965fb09"} Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 13:26:48.897729 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 13:26:48.898033 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-central-agent" containerID="cri-o://5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" gracePeriod=30 Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 13:26:48.898137 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="sg-core" containerID="cri-o://bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" gracePeriod=30 Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 
13:26:48.898142 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-notification-agent" containerID="cri-o://00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" gracePeriod=30 Oct 13 13:26:48 crc kubenswrapper[4797]: I1013 13:26:48.898267 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="proxy-httpd" containerID="cri-o://1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" gracePeriod=30 Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.832710 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891873 4797 generic.go:334] "Generic (PLEG): container finished" podID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" exitCode=0 Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891903 4797 generic.go:334] "Generic (PLEG): container finished" podID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" exitCode=2 Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891911 4797 generic.go:334] "Generic (PLEG): container finished" podID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" exitCode=0 Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891918 4797 generic.go:334] "Generic (PLEG): container finished" podID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" exitCode=0 Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891936 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerDied","Data":"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664"} Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891945 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891960 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerDied","Data":"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996"} Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891970 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerDied","Data":"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9"} Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891979 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerDied","Data":"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039"} Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.891987 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51806e8a-82b1-48f4-88bb-750d1a4b3cb3","Type":"ContainerDied","Data":"20c6b9b2143b8326176fcd4f439be9a6a14110c7dcd79750edef49f4e12c1802"} Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.892001 4797 scope.go:117] "RemoveContainer" containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.934618 4797 scope.go:117] "RemoveContainer" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.957109 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-combined-ca-bundle\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.957951 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-config-data\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.958007 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8ldg\" (UniqueName: \"kubernetes.io/projected/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-kube-api-access-f8ldg\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.958072 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-run-httpd\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.958137 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-log-httpd\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.958164 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-scripts\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: 
\"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.958178 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-sg-core-conf-yaml\") pod \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\" (UID: \"51806e8a-82b1-48f4-88bb-750d1a4b3cb3\") " Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.959199 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.959571 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.960577 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.960617 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.966011 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-scripts" (OuterVolumeSpecName: "scripts") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.966184 4797 scope.go:117] "RemoveContainer" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.968241 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-kube-api-access-f8ldg" (OuterVolumeSpecName: "kube-api-access-f8ldg") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "kube-api-access-f8ldg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:26:49 crc kubenswrapper[4797]: I1013 13:26:49.998728 4797 scope.go:117] "RemoveContainer" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.002272 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.024476 4797 scope.go:117] "RemoveContainer" containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.024915 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": container with ID starting with 1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664 not found: ID does not exist" containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.024950 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664"} err="failed to get container status \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": rpc error: code = NotFound desc = could not find container \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": container with ID starting with 1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.024971 4797 scope.go:117] 
"RemoveContainer" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.025166 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": container with ID starting with bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996 not found: ID does not exist" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.025194 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996"} err="failed to get container status \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": rpc error: code = NotFound desc = could not find container \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": container with ID starting with bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.025207 4797 scope.go:117] "RemoveContainer" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.025434 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": container with ID starting with 00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9 not found: ID does not exist" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.025455 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9"} err="failed to get container status \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": rpc error: code = NotFound desc = could not find container \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": container with ID starting with 00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.025467 4797 scope.go:117] "RemoveContainer" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.025760 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": container with ID starting with 5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039 not found: ID does not exist" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.025780 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039"} err="failed to get container status \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": rpc error: code = NotFound desc = could not find container \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": container with ID starting with 5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.025796 4797 scope.go:117] "RemoveContainer" containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.026023 4797 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664"} err="failed to get container status \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": rpc error: code = NotFound desc = could not find container \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": container with ID starting with 1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.026037 4797 scope.go:117] "RemoveContainer" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.026265 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996"} err="failed to get container status \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": rpc error: code = NotFound desc = could not find container \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": container with ID starting with bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.026282 4797 scope.go:117] "RemoveContainer" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.026994 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9"} err="failed to get container status \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": rpc error: code = NotFound desc = could not find container \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": container with ID starting with 00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9 not 
found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027019 4797 scope.go:117] "RemoveContainer" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027299 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039"} err="failed to get container status \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": rpc error: code = NotFound desc = could not find container \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": container with ID starting with 5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027320 4797 scope.go:117] "RemoveContainer" containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027528 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664"} err="failed to get container status \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": rpc error: code = NotFound desc = could not find container \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": container with ID starting with 1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027547 4797 scope.go:117] "RemoveContainer" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027724 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996"} err="failed to get 
container status \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": rpc error: code = NotFound desc = could not find container \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": container with ID starting with bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027761 4797 scope.go:117] "RemoveContainer" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.027969 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9"} err="failed to get container status \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": rpc error: code = NotFound desc = could not find container \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": container with ID starting with 00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028029 4797 scope.go:117] "RemoveContainer" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028195 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039"} err="failed to get container status \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": rpc error: code = NotFound desc = could not find container \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": container with ID starting with 5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028213 4797 scope.go:117] "RemoveContainer" 
containerID="1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028340 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664"} err="failed to get container status \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": rpc error: code = NotFound desc = could not find container \"1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664\": container with ID starting with 1b55bee031f47a3777a8b4696fb649d08108c8994777937182b3d07c6dbcc664 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028356 4797 scope.go:117] "RemoveContainer" containerID="bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028493 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996"} err="failed to get container status \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": rpc error: code = NotFound desc = could not find container \"bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996\": container with ID starting with bde5a186aab2b9cec2b089778ef44006877b19bb595852147f01590e717a2996 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028510 4797 scope.go:117] "RemoveContainer" containerID="00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028776 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9"} err="failed to get container status \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": rpc error: code = NotFound desc = could 
not find container \"00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9\": container with ID starting with 00ce7aec52aca589f046de2bd4fe8a79058b5ff1110404435b1bdb7629fab7f9 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.028793 4797 scope.go:117] "RemoveContainer" containerID="5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.029050 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039"} err="failed to get container status \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": rpc error: code = NotFound desc = could not find container \"5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039\": container with ID starting with 5cf03a4e992ec39f55550c3fdaf4900c2a03eda89e68c877b14aa306ba31b039 not found: ID does not exist" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.043765 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.065982 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.066018 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.066029 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.066038 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8ldg\" (UniqueName: \"kubernetes.io/projected/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-kube-api-access-f8ldg\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.067464 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-config-data" (OuterVolumeSpecName: "config-data") pod "51806e8a-82b1-48f4-88bb-750d1a4b3cb3" (UID: "51806e8a-82b1-48f4-88bb-750d1a4b3cb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.167663 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51806e8a-82b1-48f4-88bb-750d1a4b3cb3-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.226202 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.235069 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.247497 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.248373 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-central-agent" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248397 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-central-agent" Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.248414 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="sg-core" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248421 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="sg-core" Oct 13 13:26:50 crc kubenswrapper[4797]: E1013 13:26:50.248430 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-notification-agent" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248436 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-notification-agent" Oct 13 13:26:50 crc 
kubenswrapper[4797]: E1013 13:26:50.248451 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="proxy-httpd" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248456 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="proxy-httpd" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248629 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-central-agent" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248653 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="ceilometer-notification-agent" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248667 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="proxy-httpd" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.248697 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" containerName="sg-core" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.251821 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.258569 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.259724 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.263348 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.382304 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-run-httpd\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.382484 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-config-data\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.382667 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvszk\" (UniqueName: \"kubernetes.io/projected/6a126467-420f-4ec7-96ba-920c65d323e8-kube-api-access-xvszk\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.382762 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-log-httpd\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " 
pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.382813 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.382894 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.383189 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-scripts\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485174 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-scripts\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485240 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-run-httpd\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485277 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-config-data\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485310 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvszk\" (UniqueName: \"kubernetes.io/projected/6a126467-420f-4ec7-96ba-920c65d323e8-kube-api-access-xvszk\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485330 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485346 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-log-httpd\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.485373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.486486 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-run-httpd\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc 
kubenswrapper[4797]: I1013 13:26:50.486742 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-log-httpd\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.489821 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.490709 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.497662 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-scripts\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.499155 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-config-data\") pod \"ceilometer-0\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.503451 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvszk\" (UniqueName: \"kubernetes.io/projected/6a126467-420f-4ec7-96ba-920c65d323e8-kube-api-access-xvszk\") pod \"ceilometer-0\" (UID: 
\"6a126467-420f-4ec7-96ba-920c65d323e8\") " pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.568148 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:26:50 crc kubenswrapper[4797]: I1013 13:26:50.637696 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:51 crc kubenswrapper[4797]: I1013 13:26:51.024346 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:26:51 crc kubenswrapper[4797]: W1013 13:26:51.033341 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a126467_420f_4ec7_96ba_920c65d323e8.slice/crio-0773a68f9d2b36cc7398ab8bdc0745c07339193d03be5bb0e77f8798742da350 WatchSource:0}: Error finding container 0773a68f9d2b36cc7398ab8bdc0745c07339193d03be5bb0e77f8798742da350: Status 404 returned error can't find the container with id 0773a68f9d2b36cc7398ab8bdc0745c07339193d03be5bb0e77f8798742da350 Oct 13 13:26:51 crc kubenswrapper[4797]: I1013 13:26:51.248021 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51806e8a-82b1-48f4-88bb-750d1a4b3cb3" path="/var/lib/kubelet/pods/51806e8a-82b1-48f4-88bb-750d1a4b3cb3/volumes" Oct 13 13:26:51 crc kubenswrapper[4797]: I1013 13:26:51.914795 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerStarted","Data":"0773a68f9d2b36cc7398ab8bdc0745c07339193d03be5bb0e77f8798742da350"} Oct 13 13:26:55 crc kubenswrapper[4797]: I1013 13:26:55.954070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" event={"ID":"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76","Type":"ContainerStarted","Data":"9d64c2be9688b9fb4e9f73b6943972899943509da3f03d108888f03aae8e7ad2"} Oct 13 13:26:55 crc kubenswrapper[4797]: I1013 
13:26:55.959019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerStarted","Data":"fc4292319cde6be16435f4465e413cb1c410a093490629858f653af3b74b0b25"} Oct 13 13:26:55 crc kubenswrapper[4797]: I1013 13:26:55.974360 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" podStartSLOduration=2.104573167 podStartE2EDuration="8.974337401s" podCreationTimestamp="2025-10-13 13:26:47 +0000 UTC" firstStartedPulling="2025-10-13 13:26:48.209054031 +0000 UTC m=+1185.742604287" lastFinishedPulling="2025-10-13 13:26:55.078818265 +0000 UTC m=+1192.612368521" observedRunningTime="2025-10-13 13:26:55.969347319 +0000 UTC m=+1193.502897585" watchObservedRunningTime="2025-10-13 13:26:55.974337401 +0000 UTC m=+1193.507887647" Oct 13 13:26:56 crc kubenswrapper[4797]: I1013 13:26:56.970635 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerStarted","Data":"affb13e100d918a3e8819e8a8c640d90e1e0ae3cc17da14058c782545ec299e0"} Oct 13 13:26:56 crc kubenswrapper[4797]: I1013 13:26:56.970995 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerStarted","Data":"433f2c4c3d717f2a207de586e3f5e2ee5613f866724f64bd593dbb1974a7491f"} Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.003297 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerStarted","Data":"e0fa63fb8c8859c11abc1eeadcee2d83c6d8ccb769d6cd350684638529c986ea"} Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.003446 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" 
containerName="ceilometer-central-agent" containerID="cri-o://fc4292319cde6be16435f4465e413cb1c410a093490629858f653af3b74b0b25" gracePeriod=30 Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.003491 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="proxy-httpd" containerID="cri-o://e0fa63fb8c8859c11abc1eeadcee2d83c6d8ccb769d6cd350684638529c986ea" gracePeriod=30 Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.003506 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="sg-core" containerID="cri-o://affb13e100d918a3e8819e8a8c640d90e1e0ae3cc17da14058c782545ec299e0" gracePeriod=30 Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.003523 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="ceilometer-notification-agent" containerID="cri-o://433f2c4c3d717f2a207de586e3f5e2ee5613f866724f64bd593dbb1974a7491f" gracePeriod=30 Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.003893 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:27:00 crc kubenswrapper[4797]: I1013 13:27:00.036215 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.291286641 podStartE2EDuration="10.036197012s" podCreationTimestamp="2025-10-13 13:26:50 +0000 UTC" firstStartedPulling="2025-10-13 13:26:51.037198529 +0000 UTC m=+1188.570748795" lastFinishedPulling="2025-10-13 13:26:58.7821089 +0000 UTC m=+1196.315659166" observedRunningTime="2025-10-13 13:27:00.030848961 +0000 UTC m=+1197.564399227" watchObservedRunningTime="2025-10-13 13:27:00.036197012 +0000 UTC m=+1197.569747268" Oct 13 13:27:01 crc kubenswrapper[4797]: 
I1013 13:27:01.018171 4797 generic.go:334] "Generic (PLEG): container finished" podID="6a126467-420f-4ec7-96ba-920c65d323e8" containerID="e0fa63fb8c8859c11abc1eeadcee2d83c6d8ccb769d6cd350684638529c986ea" exitCode=0 Oct 13 13:27:01 crc kubenswrapper[4797]: I1013 13:27:01.018485 4797 generic.go:334] "Generic (PLEG): container finished" podID="6a126467-420f-4ec7-96ba-920c65d323e8" containerID="affb13e100d918a3e8819e8a8c640d90e1e0ae3cc17da14058c782545ec299e0" exitCode=2 Oct 13 13:27:01 crc kubenswrapper[4797]: I1013 13:27:01.018492 4797 generic.go:334] "Generic (PLEG): container finished" podID="6a126467-420f-4ec7-96ba-920c65d323e8" containerID="433f2c4c3d717f2a207de586e3f5e2ee5613f866724f64bd593dbb1974a7491f" exitCode=0 Oct 13 13:27:01 crc kubenswrapper[4797]: I1013 13:27:01.018284 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerDied","Data":"e0fa63fb8c8859c11abc1eeadcee2d83c6d8ccb769d6cd350684638529c986ea"} Oct 13 13:27:01 crc kubenswrapper[4797]: I1013 13:27:01.018528 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerDied","Data":"affb13e100d918a3e8819e8a8c640d90e1e0ae3cc17da14058c782545ec299e0"} Oct 13 13:27:01 crc kubenswrapper[4797]: I1013 13:27:01.018544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerDied","Data":"433f2c4c3d717f2a207de586e3f5e2ee5613f866724f64bd593dbb1974a7491f"} Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.029576 4797 generic.go:334] "Generic (PLEG): container finished" podID="6a126467-420f-4ec7-96ba-920c65d323e8" containerID="fc4292319cde6be16435f4465e413cb1c410a093490629858f653af3b74b0b25" exitCode=0 Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.029645 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerDied","Data":"fc4292319cde6be16435f4465e413cb1c410a093490629858f653af3b74b0b25"} Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.308768 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437123 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-run-httpd\") pod \"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437200 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvszk\" (UniqueName: \"kubernetes.io/projected/6a126467-420f-4ec7-96ba-920c65d323e8-kube-api-access-xvszk\") pod \"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437346 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-config-data\") pod \"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437381 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-log-httpd\") pod \"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437446 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-scripts\") pod 
\"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437491 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-combined-ca-bundle\") pod \"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.437519 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-sg-core-conf-yaml\") pod \"6a126467-420f-4ec7-96ba-920c65d323e8\" (UID: \"6a126467-420f-4ec7-96ba-920c65d323e8\") " Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.438570 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.438772 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.444268 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a126467-420f-4ec7-96ba-920c65d323e8-kube-api-access-xvszk" (OuterVolumeSpecName: "kube-api-access-xvszk") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). InnerVolumeSpecName "kube-api-access-xvszk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.461507 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-scripts" (OuterVolumeSpecName: "scripts") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.486956 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.528549 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.540051 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.540090 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.540103 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.540115 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvszk\" (UniqueName: \"kubernetes.io/projected/6a126467-420f-4ec7-96ba-920c65d323e8-kube-api-access-xvszk\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.540131 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a126467-420f-4ec7-96ba-920c65d323e8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.540143 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.590449 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-config-data" (OuterVolumeSpecName: "config-data") pod "6a126467-420f-4ec7-96ba-920c65d323e8" (UID: "6a126467-420f-4ec7-96ba-920c65d323e8"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:02 crc kubenswrapper[4797]: I1013 13:27:02.641945 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a126467-420f-4ec7-96ba-920c65d323e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.041296 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a126467-420f-4ec7-96ba-920c65d323e8","Type":"ContainerDied","Data":"0773a68f9d2b36cc7398ab8bdc0745c07339193d03be5bb0e77f8798742da350"} Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.041364 4797 scope.go:117] "RemoveContainer" containerID="e0fa63fb8c8859c11abc1eeadcee2d83c6d8ccb769d6cd350684638529c986ea" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.041378 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.072406 4797 scope.go:117] "RemoveContainer" containerID="affb13e100d918a3e8819e8a8c640d90e1e0ae3cc17da14058c782545ec299e0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.080227 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.089894 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.098964 4797 scope.go:117] "RemoveContainer" containerID="433f2c4c3d717f2a207de586e3f5e2ee5613f866724f64bd593dbb1974a7491f" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.105665 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:03 crc kubenswrapper[4797]: E1013 13:27:03.106023 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="sg-core" Oct 13 
13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.106041 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="sg-core" Oct 13 13:27:03 crc kubenswrapper[4797]: E1013 13:27:03.106062 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="proxy-httpd" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.106068 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="proxy-httpd" Oct 13 13:27:03 crc kubenswrapper[4797]: E1013 13:27:03.106125 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="ceilometer-notification-agent" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.106132 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="ceilometer-notification-agent" Oct 13 13:27:03 crc kubenswrapper[4797]: E1013 13:27:03.106145 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="ceilometer-central-agent" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.106150 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="ceilometer-central-agent" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.107839 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="ceilometer-notification-agent" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.107861 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="sg-core" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.107880 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" 
containerName="ceilometer-central-agent" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.107893 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" containerName="proxy-httpd" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.119931 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.122670 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.122829 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.130633 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.138236 4797 scope.go:117] "RemoveContainer" containerID="fc4292319cde6be16435f4465e413cb1c410a093490629858f653af3b74b0b25" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.249774 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a126467-420f-4ec7-96ba-920c65d323e8" path="/var/lib/kubelet/pods/6a126467-420f-4ec7-96ba-920c65d323e8/volumes" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255065 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-log-httpd\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255117 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-run-httpd\") pod \"ceilometer-0\" (UID: 
\"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255140 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg9s\" (UniqueName: \"kubernetes.io/projected/88171561-ebd1-4a7c-ad02-8360aa091756-kube-api-access-jkg9s\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255219 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-scripts\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255300 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255340 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-config-data\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.255554 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 
13:27:03.377968 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.378044 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-config-data\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.378063 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.378319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-log-httpd\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.378370 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-run-httpd\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.378409 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9s\" (UniqueName: \"kubernetes.io/projected/88171561-ebd1-4a7c-ad02-8360aa091756-kube-api-access-jkg9s\") pod 
\"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.378456 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-scripts\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.379462 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-run-httpd\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.380479 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-log-httpd\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.389361 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.389563 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.389600 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.393739 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.397084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-config-data\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.400291 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg9s\" (UniqueName: \"kubernetes.io/projected/88171561-ebd1-4a7c-ad02-8360aa091756-kube-api-access-jkg9s\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.402257 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-scripts\") pod \"ceilometer-0\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.444990 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:03 crc kubenswrapper[4797]: I1013 13:27:03.935551 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:03 crc kubenswrapper[4797]: W1013 13:27:03.956243 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88171561_ebd1_4a7c_ad02_8360aa091756.slice/crio-20ae6440a46898a0c7e5541d033764a67ff7d088949224b31097b59a5c980c04 WatchSource:0}: Error finding container 20ae6440a46898a0c7e5541d033764a67ff7d088949224b31097b59a5c980c04: Status 404 returned error can't find the container with id 20ae6440a46898a0c7e5541d033764a67ff7d088949224b31097b59a5c980c04 Oct 13 13:27:04 crc kubenswrapper[4797]: I1013 13:27:04.051394 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerStarted","Data":"20ae6440a46898a0c7e5541d033764a67ff7d088949224b31097b59a5c980c04"} Oct 13 13:27:06 crc kubenswrapper[4797]: I1013 13:27:06.073789 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerStarted","Data":"3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356"} Oct 13 13:27:06 crc kubenswrapper[4797]: I1013 13:27:06.074951 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerStarted","Data":"fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3"} Oct 13 13:27:07 crc kubenswrapper[4797]: I1013 13:27:07.082958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerStarted","Data":"90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955"} Oct 13 13:27:10 crc kubenswrapper[4797]: I1013 
13:27:10.116233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerStarted","Data":"cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b"} Oct 13 13:27:10 crc kubenswrapper[4797]: I1013 13:27:10.116859 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:27:10 crc kubenswrapper[4797]: I1013 13:27:10.150453 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.791567551 podStartE2EDuration="7.150430318s" podCreationTimestamp="2025-10-13 13:27:03 +0000 UTC" firstStartedPulling="2025-10-13 13:27:03.963939728 +0000 UTC m=+1201.497489984" lastFinishedPulling="2025-10-13 13:27:09.322802495 +0000 UTC m=+1206.856352751" observedRunningTime="2025-10-13 13:27:10.139758976 +0000 UTC m=+1207.673309252" watchObservedRunningTime="2025-10-13 13:27:10.150430318 +0000 UTC m=+1207.683980574" Oct 13 13:27:16 crc kubenswrapper[4797]: I1013 13:27:16.186349 4797 generic.go:334] "Generic (PLEG): container finished" podID="a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" containerID="9d64c2be9688b9fb4e9f73b6943972899943509da3f03d108888f03aae8e7ad2" exitCode=0 Oct 13 13:27:16 crc kubenswrapper[4797]: I1013 13:27:16.186949 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" event={"ID":"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76","Type":"ContainerDied","Data":"9d64c2be9688b9fb4e9f73b6943972899943509da3f03d108888f03aae8e7ad2"} Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.592484 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.755548 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-combined-ca-bundle\") pod \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.756010 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-scripts\") pod \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.756142 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v579\" (UniqueName: \"kubernetes.io/projected/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-kube-api-access-8v579\") pod \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.756250 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-config-data\") pod \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\" (UID: \"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76\") " Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.762659 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-scripts" (OuterVolumeSpecName: "scripts") pod "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" (UID: "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.762676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-kube-api-access-8v579" (OuterVolumeSpecName: "kube-api-access-8v579") pod "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" (UID: "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76"). InnerVolumeSpecName "kube-api-access-8v579". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.784977 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" (UID: "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.793431 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-config-data" (OuterVolumeSpecName: "config-data") pod "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" (UID: "a88a7d56-1b07-41ef-94ec-d39f1c9bfb76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.858371 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v579\" (UniqueName: \"kubernetes.io/projected/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-kube-api-access-8v579\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.858414 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.858425 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:17 crc kubenswrapper[4797]: I1013 13:27:17.858435 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.211044 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" event={"ID":"a88a7d56-1b07-41ef-94ec-d39f1c9bfb76","Type":"ContainerDied","Data":"5ff3c808013c7e6e1e43d22b37ea6b995bafa9b3b243e0b104ae949bf965fb09"} Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.211103 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ff3c808013c7e6e1e43d22b37ea6b995bafa9b3b243e0b104ae949bf965fb09" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.211107 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f5b9c" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.372603 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 13:27:18 crc kubenswrapper[4797]: E1013 13:27:18.373114 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" containerName="nova-cell0-conductor-db-sync" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.373137 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" containerName="nova-cell0-conductor-db-sync" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.373409 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" containerName="nova-cell0-conductor-db-sync" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.374146 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.382250 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.382857 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vngh9" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.394751 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.467595 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: 
I1013 13:27:18.467689 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z89s\" (UniqueName: \"kubernetes.io/projected/c7503305-66a2-4504-b208-6795946d8701-kube-api-access-8z89s\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.467722 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.569101 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.569166 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z89s\" (UniqueName: \"kubernetes.io/projected/c7503305-66a2-4504-b208-6795946d8701-kube-api-access-8z89s\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.569193 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.573234 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.574355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.596157 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z89s\" (UniqueName: \"kubernetes.io/projected/c7503305-66a2-4504-b208-6795946d8701-kube-api-access-8z89s\") pod \"nova-cell0-conductor-0\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:18 crc kubenswrapper[4797]: I1013 13:27:18.700758 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:19 crc kubenswrapper[4797]: I1013 13:27:19.257473 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 13:27:20 crc kubenswrapper[4797]: I1013 13:27:20.230986 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c7503305-66a2-4504-b208-6795946d8701","Type":"ContainerStarted","Data":"aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4"} Oct 13 13:27:20 crc kubenswrapper[4797]: I1013 13:27:20.231660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c7503305-66a2-4504-b208-6795946d8701","Type":"ContainerStarted","Data":"4fa7f8b954978e855fbedc304fa886ac89da8580df573901cddc814942ca2f28"} Oct 13 13:27:20 crc kubenswrapper[4797]: I1013 13:27:20.231683 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:20 crc kubenswrapper[4797]: I1013 13:27:20.263570 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.263550265 podStartE2EDuration="2.263550265s" podCreationTimestamp="2025-10-13 13:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:20.257206609 +0000 UTC m=+1217.790756865" watchObservedRunningTime="2025-10-13 13:27:20.263550265 +0000 UTC m=+1217.797100521" Oct 13 13:27:28 crc kubenswrapper[4797]: I1013 13:27:28.739953 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.338924 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pp9mc"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.340300 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.342339 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.343007 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.355828 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pp9mc"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.378880 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckrb\" (UniqueName: \"kubernetes.io/projected/87c2e451-cf73-4e5e-9e2e-703043c09184-kube-api-access-gckrb\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.379549 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-config-data\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.379893 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.379939 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-scripts\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.463921 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.465087 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.473430 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.473581 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.481326 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.481358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-scripts\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.481413 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckrb\" (UniqueName: \"kubernetes.io/projected/87c2e451-cf73-4e5e-9e2e-703043c09184-kube-api-access-gckrb\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: 
\"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.481497 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-config-data\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.489548 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-scripts\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.494028 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.495991 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-config-data\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.520262 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckrb\" (UniqueName: \"kubernetes.io/projected/87c2e451-cf73-4e5e-9e2e-703043c09184-kube-api-access-gckrb\") pod \"nova-cell0-cell-mapping-pp9mc\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") " pod="openstack/nova-cell0-cell-mapping-pp9mc" 
Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.551546 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.555790 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.558290 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.576330 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.583942 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-config-data\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.583982 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295e91cc-9024-4280-a782-3c7f3a2d19dc-logs\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.584028 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.584081 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh752\" (UniqueName: 
\"kubernetes.io/projected/d916771f-8789-42cf-aa66-11707d4825f6-kube-api-access-zh752\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.584133 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.584179 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbrs6\" (UniqueName: \"kubernetes.io/projected/295e91cc-9024-4280-a782-3c7f3a2d19dc-kube-api-access-zbrs6\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.584198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-config-data\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.599622 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.601497 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.603711 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.654716 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.664404 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pp9mc" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.687963 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh752\" (UniqueName: \"kubernetes.io/projected/d916771f-8789-42cf-aa66-11707d4825f6-kube-api-access-zh752\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.688353 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.688673 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97dc3ca-450c-4dcb-96f8-22924e5b08be-logs\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.688834 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.693124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-config-data\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.693413 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/a97dc3ca-450c-4dcb-96f8-22924e5b08be-kube-api-access-ksbp7\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.693535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbrs6\" (UniqueName: \"kubernetes.io/projected/295e91cc-9024-4280-a782-3c7f3a2d19dc-kube-api-access-zbrs6\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.693680 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-config-data\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.693986 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-config-data\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.694091 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295e91cc-9024-4280-a782-3c7f3a2d19dc-logs\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.697994 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295e91cc-9024-4280-a782-3c7f3a2d19dc-logs\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.703926 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.710709 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.723640 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-config-data\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.730771 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d74749bf5-5lfsm"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.731617 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh752\" (UniqueName: 
\"kubernetes.io/projected/d916771f-8789-42cf-aa66-11707d4825f6-kube-api-access-zh752\") pod \"nova-scheduler-0\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.731834 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-config-data\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.732210 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.732668 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.737916 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbrs6\" (UniqueName: \"kubernetes.io/projected/295e91cc-9024-4280-a782-3c7f3a2d19dc-kube-api-access-zbrs6\") pod \"nova-api-0\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.791242 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.794371 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.797349 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.799338 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.806400 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d74749bf5-5lfsm"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.819933 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822073 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822169 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-sb\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnch\" (UniqueName: \"kubernetes.io/projected/3b04026b-c0b6-4373-b87a-0eaea7d96163-kube-api-access-nwnch\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97dc3ca-450c-4dcb-96f8-22924e5b08be-logs\") pod 
\"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822344 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-config-data\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822426 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/a97dc3ca-450c-4dcb-96f8-22924e5b08be-kube-api-access-ksbp7\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822515 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-config\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822684 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-nb\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822760 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-swift-storage-0\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") 
" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.822871 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-svc\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.823406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97dc3ca-450c-4dcb-96f8-22924e5b08be-logs\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.830475 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.836542 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-config-data\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.840409 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/a97dc3ca-450c-4dcb-96f8-22924e5b08be-kube-api-access-ksbp7\") pod \"nova-metadata-0\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.900590 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930058 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930111 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930154 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-sb\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930182 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwnch\" (UniqueName: \"kubernetes.io/projected/3b04026b-c0b6-4373-b87a-0eaea7d96163-kube-api-access-nwnch\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930279 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-config\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " 
pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930296 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-nb\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930314 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-kube-api-access-p78bs\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930342 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-swift-storage-0\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.930381 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-svc\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.931184 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-svc\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc 
kubenswrapper[4797]: I1013 13:27:29.931349 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-sb\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.931893 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-nb\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.932238 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-swift-storage-0\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.932362 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.933095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-config\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:29 crc kubenswrapper[4797]: I1013 13:27:29.957390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwnch\" (UniqueName: \"kubernetes.io/projected/3b04026b-c0b6-4373-b87a-0eaea7d96163-kube-api-access-nwnch\") pod \"dnsmasq-dns-d74749bf5-5lfsm\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.032469 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-kube-api-access-p78bs\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.032577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.032610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc 
kubenswrapper[4797]: I1013 13:27:30.036480 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.038656 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.061271 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-kube-api-access-p78bs\") pod \"nova-cell1-novncproxy-0\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.122225 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.193525 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.285763 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pp9mc"] Oct 13 13:27:30 crc kubenswrapper[4797]: W1013 13:27:30.292470 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87c2e451_cf73_4e5e_9e2e_703043c09184.slice/crio-e5a12ccf8c096e32e3a0e619893ab414666649ea9f10c97769e674a7389a5699 WatchSource:0}: Error finding container e5a12ccf8c096e32e3a0e619893ab414666649ea9f10c97769e674a7389a5699: Status 404 returned error can't find the container with id e5a12ccf8c096e32e3a0e619893ab414666649ea9f10c97769e674a7389a5699 Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.346048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pp9mc" event={"ID":"87c2e451-cf73-4e5e-9e2e-703043c09184","Type":"ContainerStarted","Data":"e5a12ccf8c096e32e3a0e619893ab414666649ea9f10c97769e674a7389a5699"} Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.371066 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9drpj"] Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.372419 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.374136 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.374858 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.399656 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9drpj"] Oct 13 13:27:30 crc kubenswrapper[4797]: W1013 13:27:30.418626 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd916771f_8789_42cf_aa66_11707d4825f6.slice/crio-fb9aeba3bcd842bb33e696953339e7d39af20fe23222839c4f11e16ec69bc551 WatchSource:0}: Error finding container fb9aeba3bcd842bb33e696953339e7d39af20fe23222839c4f11e16ec69bc551: Status 404 returned error can't find the container with id fb9aeba3bcd842bb33e696953339e7d39af20fe23222839c4f11e16ec69bc551 Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.430832 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.445867 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-scripts\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.445971 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.446001 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2c7k\" (UniqueName: \"kubernetes.io/projected/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-kube-api-access-c2c7k\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.446223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-config-data\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: W1013 13:27:30.476284 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97dc3ca_450c_4dcb_96f8_22924e5b08be.slice/crio-7fa39944f964375c6d824c1fceeea1da8cd695753b3d6343c6bf09d7f2478911 WatchSource:0}: Error finding container 7fa39944f964375c6d824c1fceeea1da8cd695753b3d6343c6bf09d7f2478911: Status 404 returned error can't find the container with id 7fa39944f964375c6d824c1fceeea1da8cd695753b3d6343c6bf09d7f2478911 Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.484205 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:30 crc kubenswrapper[4797]: W1013 13:27:30.486901 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295e91cc_9024_4280_a782_3c7f3a2d19dc.slice/crio-8d2a09717a60726f99312698dccb0b45abc5b9351a09ad0864a8ee755b21dde0 
WatchSource:0}: Error finding container 8d2a09717a60726f99312698dccb0b45abc5b9351a09ad0864a8ee755b21dde0: Status 404 returned error can't find the container with id 8d2a09717a60726f99312698dccb0b45abc5b9351a09ad0864a8ee755b21dde0 Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.493933 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.547399 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.547477 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2c7k\" (UniqueName: \"kubernetes.io/projected/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-kube-api-access-c2c7k\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.547560 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-config-data\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.547622 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-scripts\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc 
kubenswrapper[4797]: I1013 13:27:30.552870 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-scripts\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.554021 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-config-data\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.554479 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.566995 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2c7k\" (UniqueName: \"kubernetes.io/projected/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-kube-api-access-c2c7k\") pod \"nova-cell1-conductor-db-sync-9drpj\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.705538 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d74749bf5-5lfsm"] Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.725153 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:30 crc kubenswrapper[4797]: I1013 13:27:30.796773 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:27:30 crc kubenswrapper[4797]: W1013 13:27:30.833448 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126b2f1c_8c5f_431d_8d9d_aefc2ca0d4c1.slice/crio-4a3819d45087ff74b060fb47730b7c74d5a2bf2ff7fcafc7a5db29c231cd9231 WatchSource:0}: Error finding container 4a3819d45087ff74b060fb47730b7c74d5a2bf2ff7fcafc7a5db29c231cd9231: Status 404 returned error can't find the container with id 4a3819d45087ff74b060fb47730b7c74d5a2bf2ff7fcafc7a5db29c231cd9231 Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.294983 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9drpj"] Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.367960 4797 generic.go:334] "Generic (PLEG): container finished" podID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerID="0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622" exitCode=0 Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.368076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" event={"ID":"3b04026b-c0b6-4373-b87a-0eaea7d96163","Type":"ContainerDied","Data":"0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.368110 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" event={"ID":"3b04026b-c0b6-4373-b87a-0eaea7d96163","Type":"ContainerStarted","Data":"a4f50de1974402f5f3119a93d2d2072d884571a013e7c4332855e24c06818dba"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.374365 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9drpj" 
event={"ID":"ee3b1264-55ce-4cb4-a390-2fb520ae9b87","Type":"ContainerStarted","Data":"619a1151790656278584ed4dae030abe89e71a5d46806f27f65f04230fe87049"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.382817 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d916771f-8789-42cf-aa66-11707d4825f6","Type":"ContainerStarted","Data":"fb9aeba3bcd842bb33e696953339e7d39af20fe23222839c4f11e16ec69bc551"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.388057 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a97dc3ca-450c-4dcb-96f8-22924e5b08be","Type":"ContainerStarted","Data":"7fa39944f964375c6d824c1fceeea1da8cd695753b3d6343c6bf09d7f2478911"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.395474 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1","Type":"ContainerStarted","Data":"4a3819d45087ff74b060fb47730b7c74d5a2bf2ff7fcafc7a5db29c231cd9231"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.402217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pp9mc" event={"ID":"87c2e451-cf73-4e5e-9e2e-703043c09184","Type":"ContainerStarted","Data":"08c5f7a46cf66da7ac4bd3d50fc97b7d99b7ffff6dfe1bf44438942e8be3a569"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.404826 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"295e91cc-9024-4280-a782-3c7f3a2d19dc","Type":"ContainerStarted","Data":"8d2a09717a60726f99312698dccb0b45abc5b9351a09ad0864a8ee755b21dde0"} Oct 13 13:27:31 crc kubenswrapper[4797]: I1013 13:27:31.425407 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pp9mc" podStartSLOduration=2.425381061 podStartE2EDuration="2.425381061s" podCreationTimestamp="2025-10-13 13:27:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:31.417449227 +0000 UTC m=+1228.950999513" watchObservedRunningTime="2025-10-13 13:27:31.425381061 +0000 UTC m=+1228.958931317" Oct 13 13:27:32 crc kubenswrapper[4797]: I1013 13:27:32.428341 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9drpj" event={"ID":"ee3b1264-55ce-4cb4-a390-2fb520ae9b87","Type":"ContainerStarted","Data":"5fbdc263c596a7c3afdf13d3673d89669a69914a1c25734b4adeb4c6c4f4e7be"} Oct 13 13:27:32 crc kubenswrapper[4797]: I1013 13:27:32.435889 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" event={"ID":"3b04026b-c0b6-4373-b87a-0eaea7d96163","Type":"ContainerStarted","Data":"82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d"} Oct 13 13:27:32 crc kubenswrapper[4797]: I1013 13:27:32.446029 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9drpj" podStartSLOduration=2.4460097149999998 podStartE2EDuration="2.446009715s" podCreationTimestamp="2025-10-13 13:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:32.439198588 +0000 UTC m=+1229.972748854" watchObservedRunningTime="2025-10-13 13:27:32.446009715 +0000 UTC m=+1229.979559971" Oct 13 13:27:32 crc kubenswrapper[4797]: I1013 13:27:32.464615 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" podStartSLOduration=3.464592549 podStartE2EDuration="3.464592549s" podCreationTimestamp="2025-10-13 13:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:32.460166981 +0000 UTC m=+1229.993717247" 
watchObservedRunningTime="2025-10-13 13:27:32.464592549 +0000 UTC m=+1229.998142815" Oct 13 13:27:33 crc kubenswrapper[4797]: I1013 13:27:33.453234 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:27:33 crc kubenswrapper[4797]: I1013 13:27:33.456624 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 13:27:33 crc kubenswrapper[4797]: I1013 13:27:33.721293 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:33 crc kubenswrapper[4797]: I1013 13:27:33.754344 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.481150 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"295e91cc-9024-4280-a782-3c7f3a2d19dc","Type":"ContainerStarted","Data":"7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a"} Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.481615 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"295e91cc-9024-4280-a782-3c7f3a2d19dc","Type":"ContainerStarted","Data":"087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e"} Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.482955 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d916771f-8789-42cf-aa66-11707d4825f6","Type":"ContainerStarted","Data":"d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff"} Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.485276 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a97dc3ca-450c-4dcb-96f8-22924e5b08be","Type":"ContainerStarted","Data":"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f"} Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.485314 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a97dc3ca-450c-4dcb-96f8-22924e5b08be","Type":"ContainerStarted","Data":"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2"} Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.485374 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-log" containerID="cri-o://dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2" gracePeriod=30 Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.485396 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-metadata" containerID="cri-o://dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f" gracePeriod=30 Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.487917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1","Type":"ContainerStarted","Data":"f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a"} Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.488105 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a" gracePeriod=30 Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.528898 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.67786662 podStartE2EDuration="6.528880439s" podCreationTimestamp="2025-10-13 13:27:29 +0000 UTC" firstStartedPulling="2025-10-13 13:27:30.489543997 +0000 UTC m=+1228.023094253" 
lastFinishedPulling="2025-10-13 13:27:34.340557806 +0000 UTC m=+1231.874108072" observedRunningTime="2025-10-13 13:27:35.510053398 +0000 UTC m=+1233.043603654" watchObservedRunningTime="2025-10-13 13:27:35.528880439 +0000 UTC m=+1233.062430695" Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.550787 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.691994915 podStartE2EDuration="6.550771104s" podCreationTimestamp="2025-10-13 13:27:29 +0000 UTC" firstStartedPulling="2025-10-13 13:27:30.481665704 +0000 UTC m=+1228.015215960" lastFinishedPulling="2025-10-13 13:27:34.340441893 +0000 UTC m=+1231.873992149" observedRunningTime="2025-10-13 13:27:35.550011125 +0000 UTC m=+1233.083561401" watchObservedRunningTime="2025-10-13 13:27:35.550771104 +0000 UTC m=+1233.084321360" Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.552678 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.637979991 podStartE2EDuration="6.552668371s" podCreationTimestamp="2025-10-13 13:27:29 +0000 UTC" firstStartedPulling="2025-10-13 13:27:30.425380914 +0000 UTC m=+1227.958931170" lastFinishedPulling="2025-10-13 13:27:34.340069294 +0000 UTC m=+1231.873619550" observedRunningTime="2025-10-13 13:27:35.529928874 +0000 UTC m=+1233.063479130" watchObservedRunningTime="2025-10-13 13:27:35.552668371 +0000 UTC m=+1233.086218627" Oct 13 13:27:35 crc kubenswrapper[4797]: I1013 13:27:35.575279 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.069145187 podStartE2EDuration="6.575259913s" podCreationTimestamp="2025-10-13 13:27:29 +0000 UTC" firstStartedPulling="2025-10-13 13:27:30.837651328 +0000 UTC m=+1228.371201584" lastFinishedPulling="2025-10-13 13:27:34.343766054 +0000 UTC m=+1231.877316310" observedRunningTime="2025-10-13 13:27:35.568454187 +0000 UTC 
m=+1233.102004463" watchObservedRunningTime="2025-10-13 13:27:35.575259913 +0000 UTC m=+1233.108810159" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.132625 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.199750 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97dc3ca-450c-4dcb-96f8-22924e5b08be-logs\") pod \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.200170 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/a97dc3ca-450c-4dcb-96f8-22924e5b08be-kube-api-access-ksbp7\") pod \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.200212 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-config-data\") pod \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.200235 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-combined-ca-bundle\") pod \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\" (UID: \"a97dc3ca-450c-4dcb-96f8-22924e5b08be\") " Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.201206 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a97dc3ca-450c-4dcb-96f8-22924e5b08be-logs" (OuterVolumeSpecName: "logs") pod "a97dc3ca-450c-4dcb-96f8-22924e5b08be" (UID: 
"a97dc3ca-450c-4dcb-96f8-22924e5b08be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.212010 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97dc3ca-450c-4dcb-96f8-22924e5b08be-kube-api-access-ksbp7" (OuterVolumeSpecName: "kube-api-access-ksbp7") pod "a97dc3ca-450c-4dcb-96f8-22924e5b08be" (UID: "a97dc3ca-450c-4dcb-96f8-22924e5b08be"). InnerVolumeSpecName "kube-api-access-ksbp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.236639 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-config-data" (OuterVolumeSpecName: "config-data") pod "a97dc3ca-450c-4dcb-96f8-22924e5b08be" (UID: "a97dc3ca-450c-4dcb-96f8-22924e5b08be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.257060 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a97dc3ca-450c-4dcb-96f8-22924e5b08be" (UID: "a97dc3ca-450c-4dcb-96f8-22924e5b08be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.302436 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/a97dc3ca-450c-4dcb-96f8-22924e5b08be-kube-api-access-ksbp7\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.302469 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.302479 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97dc3ca-450c-4dcb-96f8-22924e5b08be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.302487 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97dc3ca-450c-4dcb-96f8-22924e5b08be-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.500723 4797 generic.go:334] "Generic (PLEG): container finished" podID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerID="dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f" exitCode=0 Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.501007 4797 generic.go:334] "Generic (PLEG): container finished" podID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerID="dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2" exitCode=143 Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.500762 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.500780 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a97dc3ca-450c-4dcb-96f8-22924e5b08be","Type":"ContainerDied","Data":"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f"} Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.501069 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a97dc3ca-450c-4dcb-96f8-22924e5b08be","Type":"ContainerDied","Data":"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2"} Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.501080 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a97dc3ca-450c-4dcb-96f8-22924e5b08be","Type":"ContainerDied","Data":"7fa39944f964375c6d824c1fceeea1da8cd695753b3d6343c6bf09d7f2478911"} Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.501092 4797 scope.go:117] "RemoveContainer" containerID="dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.525504 4797 scope.go:117] "RemoveContainer" containerID="dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.546888 4797 scope.go:117] "RemoveContainer" containerID="dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f" Oct 13 13:27:36 crc kubenswrapper[4797]: E1013 13:27:36.548530 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f\": container with ID starting with dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f not found: ID does not exist" containerID="dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f" Oct 13 13:27:36 crc kubenswrapper[4797]: 
I1013 13:27:36.548559 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f"} err="failed to get container status \"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f\": rpc error: code = NotFound desc = could not find container \"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f\": container with ID starting with dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f not found: ID does not exist" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.548586 4797 scope.go:117] "RemoveContainer" containerID="dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2" Oct 13 13:27:36 crc kubenswrapper[4797]: E1013 13:27:36.551175 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2\": container with ID starting with dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2 not found: ID does not exist" containerID="dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.551257 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2"} err="failed to get container status \"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2\": rpc error: code = NotFound desc = could not find container \"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2\": container with ID starting with dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2 not found: ID does not exist" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.551289 4797 scope.go:117] "RemoveContainer" containerID="dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f" Oct 13 13:27:36 crc 
kubenswrapper[4797]: I1013 13:27:36.553072 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f"} err="failed to get container status \"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f\": rpc error: code = NotFound desc = could not find container \"dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f\": container with ID starting with dea44d4b3ff3d85577eaaae8587dd294fd3ae8a53db9cdbabd356de4c5ff483f not found: ID does not exist" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.553099 4797 scope.go:117] "RemoveContainer" containerID="dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.553526 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2"} err="failed to get container status \"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2\": rpc error: code = NotFound desc = could not find container \"dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2\": container with ID starting with dba91dc118dc2ce842cf4c7129cb293f7f3216b778ca7b5f91c2a79f124a21d2 not found: ID does not exist" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.559571 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.590589 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.605151 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:36 crc kubenswrapper[4797]: E1013 13:27:36.605584 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" 
containerName="nova-metadata-log" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.605596 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-log" Oct 13 13:27:36 crc kubenswrapper[4797]: E1013 13:27:36.605634 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-metadata" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.605640 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-metadata" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.605815 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-metadata" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.605831 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" containerName="nova-metadata-log" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.606769 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.613286 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.613552 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.620162 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.708684 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70734429-9865-47a5-a25e-f7e3b5d56d8f-logs\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.708773 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.708825 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb2cw\" (UniqueName: \"kubernetes.io/projected/70734429-9865-47a5-a25e-f7e3b5d56d8f-kube-api-access-hb2cw\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.708859 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.708893 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-config-data\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.809903 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70734429-9865-47a5-a25e-f7e3b5d56d8f-logs\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.809992 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.810025 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb2cw\" (UniqueName: \"kubernetes.io/projected/70734429-9865-47a5-a25e-f7e3b5d56d8f-kube-api-access-hb2cw\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.810060 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0" Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 
13:27:36.810092 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-config-data\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0"
Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.810899 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70734429-9865-47a5-a25e-f7e3b5d56d8f-logs\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0"
Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.815354 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-config-data\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0"
Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.817627 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0"
Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.820297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0"
Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.841433 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb2cw\" (UniqueName: \"kubernetes.io/projected/70734429-9865-47a5-a25e-f7e3b5d56d8f-kube-api-access-hb2cw\") pod \"nova-metadata-0\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " pod="openstack/nova-metadata-0"
Oct 13 13:27:36 crc kubenswrapper[4797]: I1013 13:27:36.931605 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 13 13:27:37 crc kubenswrapper[4797]: I1013 13:27:37.260117 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97dc3ca-450c-4dcb-96f8-22924e5b08be" path="/var/lib/kubelet/pods/a97dc3ca-450c-4dcb-96f8-22924e5b08be/volumes"
Oct 13 13:27:37 crc kubenswrapper[4797]: I1013 13:27:37.445538 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 13 13:27:37 crc kubenswrapper[4797]: I1013 13:27:37.517038 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70734429-9865-47a5-a25e-f7e3b5d56d8f","Type":"ContainerStarted","Data":"e9e5f1f2ea2b392e66a48a8b1c63dac261bad59ef12603fe2ef80548f272ae26"}
Oct 13 13:27:37 crc kubenswrapper[4797]: I1013 13:27:37.632762 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 13 13:27:37 crc kubenswrapper[4797]: I1013 13:27:37.632997 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="130b6035-6c63-4a81-b112-fdf5da3d970e" containerName="kube-state-metrics" containerID="cri-o://98984ae0b0aa073e880a1ec2c8d909773dc9cef1bac9574b58fb38b3b429b392" gracePeriod=30
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.538784 4797 generic.go:334] "Generic (PLEG): container finished" podID="130b6035-6c63-4a81-b112-fdf5da3d970e" containerID="98984ae0b0aa073e880a1ec2c8d909773dc9cef1bac9574b58fb38b3b429b392" exitCode=2
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.539353 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"130b6035-6c63-4a81-b112-fdf5da3d970e","Type":"ContainerDied","Data":"98984ae0b0aa073e880a1ec2c8d909773dc9cef1bac9574b58fb38b3b429b392"}
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.542121 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70734429-9865-47a5-a25e-f7e3b5d56d8f","Type":"ContainerStarted","Data":"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9"}
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.542155 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70734429-9865-47a5-a25e-f7e3b5d56d8f","Type":"ContainerStarted","Data":"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d"}
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.588187 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.588160974 podStartE2EDuration="2.588160974s" podCreationTimestamp="2025-10-13 13:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:38.564513965 +0000 UTC m=+1236.098064231" watchObservedRunningTime="2025-10-13 13:27:38.588160974 +0000 UTC m=+1236.121711240"
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.696445 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.862860 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ch6m\" (UniqueName: \"kubernetes.io/projected/130b6035-6c63-4a81-b112-fdf5da3d970e-kube-api-access-4ch6m\") pod \"130b6035-6c63-4a81-b112-fdf5da3d970e\" (UID: \"130b6035-6c63-4a81-b112-fdf5da3d970e\") "
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.867443 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130b6035-6c63-4a81-b112-fdf5da3d970e-kube-api-access-4ch6m" (OuterVolumeSpecName: "kube-api-access-4ch6m") pod "130b6035-6c63-4a81-b112-fdf5da3d970e" (UID: "130b6035-6c63-4a81-b112-fdf5da3d970e"). InnerVolumeSpecName "kube-api-access-4ch6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:27:38 crc kubenswrapper[4797]: I1013 13:27:38.965633 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ch6m\" (UniqueName: \"kubernetes.io/projected/130b6035-6c63-4a81-b112-fdf5da3d970e-kube-api-access-4ch6m\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.552247 4797 generic.go:334] "Generic (PLEG): container finished" podID="87c2e451-cf73-4e5e-9e2e-703043c09184" containerID="08c5f7a46cf66da7ac4bd3d50fc97b7d99b7ffff6dfe1bf44438942e8be3a569" exitCode=0
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.552285 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pp9mc" event={"ID":"87c2e451-cf73-4e5e-9e2e-703043c09184","Type":"ContainerDied","Data":"08c5f7a46cf66da7ac4bd3d50fc97b7d99b7ffff6dfe1bf44438942e8be3a569"}
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.554085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"130b6035-6c63-4a81-b112-fdf5da3d970e","Type":"ContainerDied","Data":"c526830c40a400cbe7ea0f85fb6104a001568d00f6f4b35837275270953f1ee7"}
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.554154 4797 scope.go:117] "RemoveContainer" containerID="98984ae0b0aa073e880a1ec2c8d909773dc9cef1bac9574b58fb38b3b429b392"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.554108 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.600663 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.609393 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.621002 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.621418 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="sg-core" containerID="cri-o://90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955" gracePeriod=30
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.621469 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-notification-agent" containerID="cri-o://3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356" gracePeriod=30
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.621406 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="proxy-httpd" containerID="cri-o://cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b" gracePeriod=30
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.621741 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-central-agent" containerID="cri-o://fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3" gracePeriod=30
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.632768 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 13 13:27:39 crc kubenswrapper[4797]: E1013 13:27:39.633304 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130b6035-6c63-4a81-b112-fdf5da3d970e" containerName="kube-state-metrics"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.633327 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="130b6035-6c63-4a81-b112-fdf5da3d970e" containerName="kube-state-metrics"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.633618 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="130b6035-6c63-4a81-b112-fdf5da3d970e" containerName="kube-state-metrics"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.634402 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.636855 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.639147 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.644740 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.780473 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.780571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.780893 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.780985 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74n9p\" (UniqueName: \"kubernetes.io/projected/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-api-access-74n9p\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.793076 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.793117 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.820507 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.882165 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.882280 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.882313 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74n9p\" (UniqueName: \"kubernetes.io/projected/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-api-access-74n9p\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.882358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.887170 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.887226 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.891267 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.906109 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.906166 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.906563 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74n9p\" (UniqueName: \"kubernetes.io/projected/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-api-access-74n9p\") pod \"kube-state-metrics-0\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " pod="openstack/kube-state-metrics-0"
Oct 13 13:27:39 crc kubenswrapper[4797]: I1013 13:27:39.964123 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.127298 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.195730 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb9f44c77-zgtk5"]
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.196110 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerName="dnsmasq-dns" containerID="cri-o://64bc7a351dba6fd2ea568ac0358f7489d2418e46d95c7d997ddaf2c98466fa14" gracePeriod=10
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.196618 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.409301 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 13 13:27:40 crc kubenswrapper[4797]: W1013 13:27:40.419970 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddeeadaa_8237_4fad_8fd7_8e9c0580a1ed.slice/crio-bd517bbc565e9f1dcced1ca5f49edd0136a1c502ba0357f09621f60b395484c4 WatchSource:0}: Error finding container bd517bbc565e9f1dcced1ca5f49edd0136a1c502ba0357f09621f60b395484c4: Status 404 returned error can't find the container with id bd517bbc565e9f1dcced1ca5f49edd0136a1c502ba0357f09621f60b395484c4
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.568626 4797 generic.go:334] "Generic (PLEG): container finished" podID="88171561-ebd1-4a7c-ad02-8360aa091756" containerID="cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b" exitCode=0
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.569046 4797 generic.go:334] "Generic (PLEG): container finished" podID="88171561-ebd1-4a7c-ad02-8360aa091756" containerID="90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955" exitCode=2
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.569061 4797 generic.go:334] "Generic (PLEG): container finished" podID="88171561-ebd1-4a7c-ad02-8360aa091756" containerID="fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3" exitCode=0
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.568698 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerDied","Data":"cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b"}
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.569131 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerDied","Data":"90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955"}
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.569147 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerDied","Data":"fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3"}
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.574585 4797 generic.go:334] "Generic (PLEG): container finished" podID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerID="64bc7a351dba6fd2ea568ac0358f7489d2418e46d95c7d997ddaf2c98466fa14" exitCode=0
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.574668 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" event={"ID":"c7c3de59-8ab4-450c-97bb-4826fb66db39","Type":"ContainerDied","Data":"64bc7a351dba6fd2ea568ac0358f7489d2418e46d95c7d997ddaf2c98466fa14"}
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.576738 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed","Type":"ContainerStarted","Data":"bd517bbc565e9f1dcced1ca5f49edd0136a1c502ba0357f09621f60b395484c4"}
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.627611 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.744137 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.909985 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.910329 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.915543 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-sb\") pod \"c7c3de59-8ab4-450c-97bb-4826fb66db39\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") "
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.915591 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-nb\") pod \"c7c3de59-8ab4-450c-97bb-4826fb66db39\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") "
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.915694 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-config\") pod \"c7c3de59-8ab4-450c-97bb-4826fb66db39\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") "
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.915753 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-swift-storage-0\") pod \"c7c3de59-8ab4-450c-97bb-4826fb66db39\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") "
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.915845 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-svc\") pod \"c7c3de59-8ab4-450c-97bb-4826fb66db39\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") "
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.915882 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwsd6\" (UniqueName: \"kubernetes.io/projected/c7c3de59-8ab4-450c-97bb-4826fb66db39-kube-api-access-kwsd6\") pod \"c7c3de59-8ab4-450c-97bb-4826fb66db39\" (UID: \"c7c3de59-8ab4-450c-97bb-4826fb66db39\") "
Oct 13 13:27:40 crc kubenswrapper[4797]: I1013 13:27:40.944181 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c3de59-8ab4-450c-97bb-4826fb66db39-kube-api-access-kwsd6" (OuterVolumeSpecName: "kube-api-access-kwsd6") pod "c7c3de59-8ab4-450c-97bb-4826fb66db39" (UID: "c7c3de59-8ab4-450c-97bb-4826fb66db39"). InnerVolumeSpecName "kube-api-access-kwsd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.017971 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwsd6\" (UniqueName: \"kubernetes.io/projected/c7c3de59-8ab4-450c-97bb-4826fb66db39-kube-api-access-kwsd6\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.089824 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c7c3de59-8ab4-450c-97bb-4826fb66db39" (UID: "c7c3de59-8ab4-450c-97bb-4826fb66db39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.121347 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.122250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7c3de59-8ab4-450c-97bb-4826fb66db39" (UID: "c7c3de59-8ab4-450c-97bb-4826fb66db39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.134041 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7c3de59-8ab4-450c-97bb-4826fb66db39" (UID: "c7c3de59-8ab4-450c-97bb-4826fb66db39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.156879 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pp9mc"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.173680 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-config" (OuterVolumeSpecName: "config") pod "c7c3de59-8ab4-450c-97bb-4826fb66db39" (UID: "c7c3de59-8ab4-450c-97bb-4826fb66db39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.174255 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7c3de59-8ab4-450c-97bb-4826fb66db39" (UID: "c7c3de59-8ab4-450c-97bb-4826fb66db39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.222715 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.222752 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.222763 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-config\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.222771 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7c3de59-8ab4-450c-97bb-4826fb66db39-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.264195 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130b6035-6c63-4a81-b112-fdf5da3d970e" path="/var/lib/kubelet/pods/130b6035-6c63-4a81-b112-fdf5da3d970e/volumes"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.332984 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckrb\" (UniqueName: \"kubernetes.io/projected/87c2e451-cf73-4e5e-9e2e-703043c09184-kube-api-access-gckrb\") pod \"87c2e451-cf73-4e5e-9e2e-703043c09184\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") "
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.333197 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-config-data\") pod \"87c2e451-cf73-4e5e-9e2e-703043c09184\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") "
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.333247 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-combined-ca-bundle\") pod \"87c2e451-cf73-4e5e-9e2e-703043c09184\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") "
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.333307 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-scripts\") pod \"87c2e451-cf73-4e5e-9e2e-703043c09184\" (UID: \"87c2e451-cf73-4e5e-9e2e-703043c09184\") "
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.339860 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-scripts" (OuterVolumeSpecName: "scripts") pod "87c2e451-cf73-4e5e-9e2e-703043c09184" (UID: "87c2e451-cf73-4e5e-9e2e-703043c09184"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.340787 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c2e451-cf73-4e5e-9e2e-703043c09184-kube-api-access-gckrb" (OuterVolumeSpecName: "kube-api-access-gckrb") pod "87c2e451-cf73-4e5e-9e2e-703043c09184" (UID: "87c2e451-cf73-4e5e-9e2e-703043c09184"). InnerVolumeSpecName "kube-api-access-gckrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.370440 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87c2e451-cf73-4e5e-9e2e-703043c09184" (UID: "87c2e451-cf73-4e5e-9e2e-703043c09184"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.375967 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-config-data" (OuterVolumeSpecName: "config-data") pod "87c2e451-cf73-4e5e-9e2e-703043c09184" (UID: "87c2e451-cf73-4e5e-9e2e-703043c09184"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.434846 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckrb\" (UniqueName: \"kubernetes.io/projected/87c2e451-cf73-4e5e-9e2e-703043c09184-kube-api-access-gckrb\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.434874 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.434886 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.434896 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c2e451-cf73-4e5e-9e2e-703043c09184-scripts\") on node \"crc\" DevicePath \"\""
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.613744 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5" event={"ID":"c7c3de59-8ab4-450c-97bb-4826fb66db39","Type":"ContainerDied","Data":"e9eabd68ecee04115a390bdfaef77ce3985625110380b3d2ca3ef83d5a738b5b"}
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.614029 4797 scope.go:117] "RemoveContainer" containerID="64bc7a351dba6fd2ea568ac0358f7489d2418e46d95c7d997ddaf2c98466fa14"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.613953 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb9f44c77-zgtk5"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.620553 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pp9mc" event={"ID":"87c2e451-cf73-4e5e-9e2e-703043c09184","Type":"ContainerDied","Data":"e5a12ccf8c096e32e3a0e619893ab414666649ea9f10c97769e674a7389a5699"}
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.620598 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a12ccf8c096e32e3a0e619893ab414666649ea9f10c97769e674a7389a5699"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.620716 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pp9mc"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.625657 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed","Type":"ContainerStarted","Data":"be06c7925ca69832ee2ecd465bccfa99df2217df9880fecf2f1eaf2ed8591ad0"}
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.625866 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.646297 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb9f44c77-zgtk5"]
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.669671 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb9f44c77-zgtk5"]
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.674557 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.243555198 podStartE2EDuration="2.674541282s" podCreationTimestamp="2025-10-13 13:27:39 +0000 UTC" firstStartedPulling="2025-10-13 13:27:40.424747676 +0000 UTC m=+1237.958297932" lastFinishedPulling="2025-10-13 13:27:40.85573376 +0000 UTC m=+1238.389284016" observedRunningTime="2025-10-13 13:27:41.659139236 +0000 UTC m=+1239.192689502" watchObservedRunningTime="2025-10-13 13:27:41.674541282 +0000 UTC m=+1239.208091538"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.684197 4797 scope.go:117] "RemoveContainer" containerID="0b1d339d1c89a1dbbb64993f7f83a493c2b87570452b25455894f2ba368b079c"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.747381 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.748008 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-api" containerID="cri-o://7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a" gracePeriod=30
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.747613 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-log" containerID="cri-o://087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e" gracePeriod=30
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.804974 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.805229 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-log" containerID="cri-o://85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d" gracePeriod=30
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.805318 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-metadata" containerID="cri-o://6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9" gracePeriod=30
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.911471 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.932191 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 13 13:27:41 crc kubenswrapper[4797]: I1013 13:27:41.932243 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.424921 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.428169 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.560979 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-scripts\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") "
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561027 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-nova-metadata-tls-certs\") pod \"70734429-9865-47a5-a25e-f7e3b5d56d8f\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") "
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561127 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-config-data\") pod \"70734429-9865-47a5-a25e-f7e3b5d56d8f\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") "
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70734429-9865-47a5-a25e-f7e3b5d56d8f-logs\") pod \"70734429-9865-47a5-a25e-f7e3b5d56d8f\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") "
Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561172 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-combined-ca-bundle\") pod \"70734429-9865-47a5-a25e-f7e3b5d56d8f\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") "
Oct 13 13:27:42 crc
kubenswrapper[4797]: I1013 13:27:42.561193 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb2cw\" (UniqueName: \"kubernetes.io/projected/70734429-9865-47a5-a25e-f7e3b5d56d8f-kube-api-access-hb2cw\") pod \"70734429-9865-47a5-a25e-f7e3b5d56d8f\" (UID: \"70734429-9865-47a5-a25e-f7e3b5d56d8f\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561232 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-config-data\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561275 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-run-httpd\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561324 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-sg-core-conf-yaml\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561375 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-combined-ca-bundle\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561421 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkg9s\" (UniqueName: 
\"kubernetes.io/projected/88171561-ebd1-4a7c-ad02-8360aa091756-kube-api-access-jkg9s\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.561438 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-log-httpd\") pod \"88171561-ebd1-4a7c-ad02-8360aa091756\" (UID: \"88171561-ebd1-4a7c-ad02-8360aa091756\") " Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.562361 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70734429-9865-47a5-a25e-f7e3b5d56d8f-logs" (OuterVolumeSpecName: "logs") pod "70734429-9865-47a5-a25e-f7e3b5d56d8f" (UID: "70734429-9865-47a5-a25e-f7e3b5d56d8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.562683 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.562905 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.562922 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70734429-9865-47a5-a25e-f7e3b5d56d8f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.562936 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.567623 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88171561-ebd1-4a7c-ad02-8360aa091756-kube-api-access-jkg9s" (OuterVolumeSpecName: "kube-api-access-jkg9s") pod "88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "kube-api-access-jkg9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.572942 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-scripts" (OuterVolumeSpecName: "scripts") pod "88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.573054 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70734429-9865-47a5-a25e-f7e3b5d56d8f-kube-api-access-hb2cw" (OuterVolumeSpecName: "kube-api-access-hb2cw") pod "70734429-9865-47a5-a25e-f7e3b5d56d8f" (UID: "70734429-9865-47a5-a25e-f7e3b5d56d8f"). InnerVolumeSpecName "kube-api-access-hb2cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.617940 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.622491 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70734429-9865-47a5-a25e-f7e3b5d56d8f" (UID: "70734429-9865-47a5-a25e-f7e3b5d56d8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.629769 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-config-data" (OuterVolumeSpecName: "config-data") pod "70734429-9865-47a5-a25e-f7e3b5d56d8f" (UID: "70734429-9865-47a5-a25e-f7e3b5d56d8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.660001 4797 generic.go:334] "Generic (PLEG): container finished" podID="88171561-ebd1-4a7c-ad02-8360aa091756" containerID="3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356" exitCode=0 Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.660086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerDied","Data":"3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356"} Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.660111 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88171561-ebd1-4a7c-ad02-8360aa091756","Type":"ContainerDied","Data":"20ae6440a46898a0c7e5541d033764a67ff7d088949224b31097b59a5c980c04"} Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.660127 4797 scope.go:117] "RemoveContainer" containerID="cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.660237 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.660902 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "70734429-9865-47a5-a25e-f7e3b5d56d8f" (UID: "70734429-9865-47a5-a25e-f7e3b5d56d8f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.664903 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665236 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665253 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665268 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkg9s\" (UniqueName: \"kubernetes.io/projected/88171561-ebd1-4a7c-ad02-8360aa091756-kube-api-access-jkg9s\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665284 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665296 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665308 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665319 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70734429-9865-47a5-a25e-f7e3b5d56d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665332 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb2cw\" (UniqueName: \"kubernetes.io/projected/70734429-9865-47a5-a25e-f7e3b5d56d8f-kube-api-access-hb2cw\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.665343 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88171561-ebd1-4a7c-ad02-8360aa091756-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.680462 4797 generic.go:334] "Generic (PLEG): container finished" podID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerID="6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9" exitCode=0 Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.680500 4797 generic.go:334] "Generic (PLEG): container finished" podID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerID="85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d" exitCode=143 Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.680552 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70734429-9865-47a5-a25e-f7e3b5d56d8f","Type":"ContainerDied","Data":"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9"} Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.680585 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"70734429-9865-47a5-a25e-f7e3b5d56d8f","Type":"ContainerDied","Data":"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d"} Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.680600 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70734429-9865-47a5-a25e-f7e3b5d56d8f","Type":"ContainerDied","Data":"e9e5f1f2ea2b392e66a48a8b1c63dac261bad59ef12603fe2ef80548f272ae26"} Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.680665 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.696090 4797 generic.go:334] "Generic (PLEG): container finished" podID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerID="087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e" exitCode=143 Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.696262 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d916771f-8789-42cf-aa66-11707d4825f6" containerName="nova-scheduler-scheduler" containerID="cri-o://d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" gracePeriod=30 Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.696524 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"295e91cc-9024-4280-a782-3c7f3a2d19dc","Type":"ContainerDied","Data":"087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e"} Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.697725 4797 scope.go:117] "RemoveContainer" containerID="90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.708005 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-config-data" (OuterVolumeSpecName: "config-data") pod 
"88171561-ebd1-4a7c-ad02-8360aa091756" (UID: "88171561-ebd1-4a7c-ad02-8360aa091756"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.720717 4797 scope.go:117] "RemoveContainer" containerID="3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.739995 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.753046 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.766550 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.766991 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerName="init" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767004 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerName="init" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.767022 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerName="dnsmasq-dns" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767029 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerName="dnsmasq-dns" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.767049 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-central-agent" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767056 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-central-agent" Oct 13 13:27:42 
crc kubenswrapper[4797]: E1013 13:27:42.767067 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="sg-core" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767073 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="sg-core" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.767090 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-notification-agent" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767098 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-notification-agent" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.767109 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c2e451-cf73-4e5e-9e2e-703043c09184" containerName="nova-manage" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767115 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c2e451-cf73-4e5e-9e2e-703043c09184" containerName="nova-manage" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.767126 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="proxy-httpd" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767146 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="proxy-httpd" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.767157 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-metadata" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767163 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-metadata" Oct 13 13:27:42 crc 
kubenswrapper[4797]: E1013 13:27:42.767173 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-log" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767179 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-log" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767361 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="proxy-httpd" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767376 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-notification-agent" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767391 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="ceilometer-central-agent" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767408 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" containerName="dnsmasq-dns" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767419 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" containerName="sg-core" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767428 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-log" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767445 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" containerName="nova-metadata-metadata" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767453 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c2e451-cf73-4e5e-9e2e-703043c09184" 
containerName="nova-manage" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.767559 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88171561-ebd1-4a7c-ad02-8360aa091756-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.768382 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.770176 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.770305 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.773550 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.836575 4797 scope.go:117] "RemoveContainer" containerID="fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.867149 4797 scope.go:117] "RemoveContainer" containerID="cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.867636 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b\": container with ID starting with cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b not found: ID does not exist" containerID="cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.867705 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b"} err="failed to get container status \"cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b\": rpc error: code = NotFound desc = could not find container \"cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b\": container with ID starting with cbb5c4f10d0236cd1b245c2570179bf7a542ef5614cb3ba16b96a7951f5af82b not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.867733 4797 scope.go:117] "RemoveContainer" containerID="90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.868245 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955\": container with ID starting with 90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955 not found: ID does not exist" containerID="90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868285 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955"} err="failed to get container status \"90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955\": rpc error: code = NotFound desc = could not find container \"90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955\": container with ID starting with 90b297b1efc9c1fb3133d1d8335d0e00bdfc160e3da3b580942756cf725c4955 not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868310 4797 scope.go:117] "RemoveContainer" containerID="3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.868601 4797 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356\": container with ID starting with 3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356 not found: ID does not exist" containerID="3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868654 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868642 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356"} err="failed to get container status \"3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356\": rpc error: code = NotFound desc = could not find container \"3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356\": container with ID starting with 3fc0e047482f6b4e8796efcb93239b4645ccc1777e2bd67758d7006e21be9356 not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868685 4797 scope.go:117] "RemoveContainer" containerID="fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868754 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aac353b-6a2c-4072-b40f-cc91a3907bce-logs\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868797 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ccm\" (UniqueName: \"kubernetes.io/projected/4aac353b-6a2c-4072-b40f-cc91a3907bce-kube-api-access-s4ccm\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.868902 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-config-data\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.869117 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3\": container with ID starting with fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3 not found: ID does not exist" containerID="fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.869161 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3"} err="failed to get container status \"fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3\": rpc error: code = NotFound desc = could not find container \"fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3\": container with ID starting with 
fa599775b1fb6fc88fdbec4282bc3aee28747d9ba54de9e4632164935a2d64f3 not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.869180 4797 scope.go:117] "RemoveContainer" containerID="6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.892080 4797 scope.go:117] "RemoveContainer" containerID="85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.915241 4797 scope.go:117] "RemoveContainer" containerID="6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.915730 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9\": container with ID starting with 6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9 not found: ID does not exist" containerID="6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.915787 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9"} err="failed to get container status \"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9\": rpc error: code = NotFound desc = could not find container \"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9\": container with ID starting with 6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9 not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.915836 4797 scope.go:117] "RemoveContainer" containerID="85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d" Oct 13 13:27:42 crc kubenswrapper[4797]: E1013 13:27:42.916253 4797 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d\": container with ID starting with 85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d not found: ID does not exist" containerID="85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.916307 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d"} err="failed to get container status \"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d\": rpc error: code = NotFound desc = could not find container \"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d\": container with ID starting with 85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.916328 4797 scope.go:117] "RemoveContainer" containerID="6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.916569 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9"} err="failed to get container status \"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9\": rpc error: code = NotFound desc = could not find container \"6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9\": container with ID starting with 6d59470a3a30b2ba3ad49e40a91d4f792fa1b9f1f73604dc7341b0ea2f8f77e9 not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.916589 4797 scope.go:117] "RemoveContainer" containerID="85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.916993 4797 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d"} err="failed to get container status \"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d\": rpc error: code = NotFound desc = could not find container \"85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d\": container with ID starting with 85a5ae6fa09e816cd486557f2e30297d5061e15ac825eb4688e465ebe406464d not found: ID does not exist" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.970444 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.970541 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.970585 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aac353b-6a2c-4072-b40f-cc91a3907bce-logs\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.970649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ccm\" (UniqueName: \"kubernetes.io/projected/4aac353b-6a2c-4072-b40f-cc91a3907bce-kube-api-access-s4ccm\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 
13:27:42.970696 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-config-data\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.972343 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aac353b-6a2c-4072-b40f-cc91a3907bce-logs\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.973585 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.973847 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.975185 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-config-data\") pod \"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:42 crc kubenswrapper[4797]: I1013 13:27:42.995534 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ccm\" (UniqueName: \"kubernetes.io/projected/4aac353b-6a2c-4072-b40f-cc91a3907bce-kube-api-access-s4ccm\") pod 
\"nova-metadata-0\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " pod="openstack/nova-metadata-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.009242 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.019903 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.039521 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.042336 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.045320 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.045413 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.045544 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.064140 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.130235 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.174719 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-run-httpd\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.174815 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.174947 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njfsk\" (UniqueName: \"kubernetes.io/projected/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-kube-api-access-njfsk\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.175002 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-scripts\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.175034 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-log-httpd\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.175074 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.175124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.175165 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-config-data\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.278928 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70734429-9865-47a5-a25e-f7e3b5d56d8f" path="/var/lib/kubelet/pods/70734429-9865-47a5-a25e-f7e3b5d56d8f/volumes" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.282559 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88171561-ebd1-4a7c-ad02-8360aa091756" path="/var/lib/kubelet/pods/88171561-ebd1-4a7c-ad02-8360aa091756/volumes" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.283510 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c3de59-8ab4-450c-97bb-4826fb66db39" path="/var/lib/kubelet/pods/c7c3de59-8ab4-450c-97bb-4826fb66db39/volumes" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285488 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285571 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njfsk\" (UniqueName: \"kubernetes.io/projected/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-kube-api-access-njfsk\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-scripts\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285676 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-log-httpd\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285732 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285791 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc 
kubenswrapper[4797]: I1013 13:27:43.285851 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-config-data\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.285939 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-run-httpd\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.286674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-log-httpd\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.286996 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-run-httpd\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.305119 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.307939 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.314533 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-config-data\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.314752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.323757 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-scripts\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.329268 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njfsk\" (UniqueName: \"kubernetes.io/projected/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-kube-api-access-njfsk\") pod \"ceilometer-0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") " pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.377070 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.546388 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="130b6035-6c63-4a81-b112-fdf5da3d970e" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.672016 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.707531 4797 generic.go:334] "Generic (PLEG): container finished" podID="ee3b1264-55ce-4cb4-a390-2fb520ae9b87" containerID="5fbdc263c596a7c3afdf13d3673d89669a69914a1c25734b4adeb4c6c4f4e7be" exitCode=0 Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.707597 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9drpj" event={"ID":"ee3b1264-55ce-4cb4-a390-2fb520ae9b87","Type":"ContainerDied","Data":"5fbdc263c596a7c3afdf13d3673d89669a69914a1c25734b4adeb4c6c4f4e7be"} Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.708848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4aac353b-6a2c-4072-b40f-cc91a3907bce","Type":"ContainerStarted","Data":"554c6ec6d8d8580a7786e8f9862a6d56e6c5a24f50910c97fb741def07be9cee"} Oct 13 13:27:43 crc kubenswrapper[4797]: I1013 13:27:43.824541 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:27:44 crc kubenswrapper[4797]: I1013 13:27:44.749201 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4aac353b-6a2c-4072-b40f-cc91a3907bce","Type":"ContainerStarted","Data":"a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185"} Oct 13 13:27:44 crc kubenswrapper[4797]: I1013 13:27:44.749532 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4aac353b-6a2c-4072-b40f-cc91a3907bce","Type":"ContainerStarted","Data":"1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6"} Oct 13 13:27:44 crc kubenswrapper[4797]: I1013 13:27:44.750964 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerStarted","Data":"65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7"} Oct 13 13:27:44 crc kubenswrapper[4797]: I1013 13:27:44.751107 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerStarted","Data":"c00a186ed1a93dce95f9aefbc72fb153ff03a646780e38011fe1d97a32fa61c4"} Oct 13 13:27:44 crc kubenswrapper[4797]: I1013 13:27:44.770256 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.770235959 podStartE2EDuration="2.770235959s" podCreationTimestamp="2025-10-13 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:44.76703415 +0000 UTC m=+1242.300584416" watchObservedRunningTime="2025-10-13 13:27:44.770235959 +0000 UTC m=+1242.303786215" Oct 13 13:27:44 crc kubenswrapper[4797]: E1013 13:27:44.796065 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 13:27:44 crc kubenswrapper[4797]: E1013 13:27:44.797281 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 13:27:44 crc kubenswrapper[4797]: E1013 13:27:44.798505 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 13:27:44 crc kubenswrapper[4797]: E1013 13:27:44.798549 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d916771f-8789-42cf-aa66-11707d4825f6" containerName="nova-scheduler-scheduler" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.114169 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.222604 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2c7k\" (UniqueName: \"kubernetes.io/projected/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-kube-api-access-c2c7k\") pod \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.222820 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-config-data\") pod \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.223030 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-combined-ca-bundle\") pod \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.223065 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-scripts\") pod \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\" (UID: \"ee3b1264-55ce-4cb4-a390-2fb520ae9b87\") " Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.227741 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-kube-api-access-c2c7k" (OuterVolumeSpecName: "kube-api-access-c2c7k") pod "ee3b1264-55ce-4cb4-a390-2fb520ae9b87" (UID: "ee3b1264-55ce-4cb4-a390-2fb520ae9b87"). InnerVolumeSpecName "kube-api-access-c2c7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.229134 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-scripts" (OuterVolumeSpecName: "scripts") pod "ee3b1264-55ce-4cb4-a390-2fb520ae9b87" (UID: "ee3b1264-55ce-4cb4-a390-2fb520ae9b87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.257064 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-config-data" (OuterVolumeSpecName: "config-data") pod "ee3b1264-55ce-4cb4-a390-2fb520ae9b87" (UID: "ee3b1264-55ce-4cb4-a390-2fb520ae9b87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.259964 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee3b1264-55ce-4cb4-a390-2fb520ae9b87" (UID: "ee3b1264-55ce-4cb4-a390-2fb520ae9b87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.325349 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2c7k\" (UniqueName: \"kubernetes.io/projected/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-kube-api-access-c2c7k\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.325493 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.325554 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.325614 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3b1264-55ce-4cb4-a390-2fb520ae9b87-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.777141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerStarted","Data":"b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777"} Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.780084 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9drpj" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.789453 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9drpj" event={"ID":"ee3b1264-55ce-4cb4-a390-2fb520ae9b87","Type":"ContainerDied","Data":"619a1151790656278584ed4dae030abe89e71a5d46806f27f65f04230fe87049"} Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.789490 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="619a1151790656278584ed4dae030abe89e71a5d46806f27f65f04230fe87049" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.828418 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 13:27:45 crc kubenswrapper[4797]: E1013 13:27:45.828978 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3b1264-55ce-4cb4-a390-2fb520ae9b87" containerName="nova-cell1-conductor-db-sync" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.828998 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3b1264-55ce-4cb4-a390-2fb520ae9b87" containerName="nova-cell1-conductor-db-sync" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.829252 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3b1264-55ce-4cb4-a390-2fb520ae9b87" containerName="nova-cell1-conductor-db-sync" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.829988 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.836939 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.858498 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.939072 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.939174 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhbhl\" (UniqueName: \"kubernetes.io/projected/60394f60-af79-4a07-8f3f-75fb61c31894-kube-api-access-fhbhl\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:45 crc kubenswrapper[4797]: I1013 13:27:45.939349 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.040865 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc 
kubenswrapper[4797]: I1013 13:27:46.041125 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.041277 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhbhl\" (UniqueName: \"kubernetes.io/projected/60394f60-af79-4a07-8f3f-75fb61c31894-kube-api-access-fhbhl\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.044958 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.048317 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.061135 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhbhl\" (UniqueName: \"kubernetes.io/projected/60394f60-af79-4a07-8f3f-75fb61c31894-kube-api-access-fhbhl\") pod \"nova-cell1-conductor-0\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.180751 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.648273 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 13:27:46 crc kubenswrapper[4797]: W1013 13:27:46.650232 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60394f60_af79_4a07_8f3f_75fb61c31894.slice/crio-0ac92e3ed27f5d295a88366f51b035ae718352911b12dc0dd6a7cbef5c66267c WatchSource:0}: Error finding container 0ac92e3ed27f5d295a88366f51b035ae718352911b12dc0dd6a7cbef5c66267c: Status 404 returned error can't find the container with id 0ac92e3ed27f5d295a88366f51b035ae718352911b12dc0dd6a7cbef5c66267c Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.793301 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60394f60-af79-4a07-8f3f-75fb61c31894","Type":"ContainerStarted","Data":"0ac92e3ed27f5d295a88366f51b035ae718352911b12dc0dd6a7cbef5c66267c"} Oct 13 13:27:46 crc kubenswrapper[4797]: I1013 13:27:46.796972 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerStarted","Data":"aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240"} Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.628523 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.696475 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.775801 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-config-data\") pod \"d916771f-8789-42cf-aa66-11707d4825f6\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.775880 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-combined-ca-bundle\") pod \"295e91cc-9024-4280-a782-3c7f3a2d19dc\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.775904 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh752\" (UniqueName: \"kubernetes.io/projected/d916771f-8789-42cf-aa66-11707d4825f6-kube-api-access-zh752\") pod \"d916771f-8789-42cf-aa66-11707d4825f6\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.775950 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbrs6\" (UniqueName: \"kubernetes.io/projected/295e91cc-9024-4280-a782-3c7f3a2d19dc-kube-api-access-zbrs6\") pod \"295e91cc-9024-4280-a782-3c7f3a2d19dc\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.775996 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295e91cc-9024-4280-a782-3c7f3a2d19dc-logs\") pod \"295e91cc-9024-4280-a782-3c7f3a2d19dc\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.776145 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-config-data\") pod \"295e91cc-9024-4280-a782-3c7f3a2d19dc\" (UID: \"295e91cc-9024-4280-a782-3c7f3a2d19dc\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.776272 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-combined-ca-bundle\") pod \"d916771f-8789-42cf-aa66-11707d4825f6\" (UID: \"d916771f-8789-42cf-aa66-11707d4825f6\") " Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.777632 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295e91cc-9024-4280-a782-3c7f3a2d19dc-logs" (OuterVolumeSpecName: "logs") pod "295e91cc-9024-4280-a782-3c7f3a2d19dc" (UID: "295e91cc-9024-4280-a782-3c7f3a2d19dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.782893 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d916771f-8789-42cf-aa66-11707d4825f6-kube-api-access-zh752" (OuterVolumeSpecName: "kube-api-access-zh752") pod "d916771f-8789-42cf-aa66-11707d4825f6" (UID: "d916771f-8789-42cf-aa66-11707d4825f6"). InnerVolumeSpecName "kube-api-access-zh752". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.798622 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295e91cc-9024-4280-a782-3c7f3a2d19dc-kube-api-access-zbrs6" (OuterVolumeSpecName: "kube-api-access-zbrs6") pod "295e91cc-9024-4280-a782-3c7f3a2d19dc" (UID: "295e91cc-9024-4280-a782-3c7f3a2d19dc"). InnerVolumeSpecName "kube-api-access-zbrs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.806233 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d916771f-8789-42cf-aa66-11707d4825f6" (UID: "d916771f-8789-42cf-aa66-11707d4825f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.811265 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-config-data" (OuterVolumeSpecName: "config-data") pod "295e91cc-9024-4280-a782-3c7f3a2d19dc" (UID: "295e91cc-9024-4280-a782-3c7f3a2d19dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.826248 4797 generic.go:334] "Generic (PLEG): container finished" podID="d916771f-8789-42cf-aa66-11707d4825f6" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" exitCode=0 Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.826327 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d916771f-8789-42cf-aa66-11707d4825f6","Type":"ContainerDied","Data":"d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff"} Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.826354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d916771f-8789-42cf-aa66-11707d4825f6","Type":"ContainerDied","Data":"fb9aeba3bcd842bb33e696953339e7d39af20fe23222839c4f11e16ec69bc551"} Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.826373 4797 scope.go:117] "RemoveContainer" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" Oct 13 13:27:47 crc 
kubenswrapper[4797]: I1013 13:27:47.826464 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.838889 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-config-data" (OuterVolumeSpecName: "config-data") pod "d916771f-8789-42cf-aa66-11707d4825f6" (UID: "d916771f-8789-42cf-aa66-11707d4825f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.839833 4797 generic.go:334] "Generic (PLEG): container finished" podID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerID="7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a" exitCode=0 Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.839910 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"295e91cc-9024-4280-a782-3c7f3a2d19dc","Type":"ContainerDied","Data":"7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a"} Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.839914 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.839937 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"295e91cc-9024-4280-a782-3c7f3a2d19dc","Type":"ContainerDied","Data":"8d2a09717a60726f99312698dccb0b45abc5b9351a09ad0864a8ee755b21dde0"} Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.847925 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60394f60-af79-4a07-8f3f-75fb61c31894","Type":"ContainerStarted","Data":"45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53"} Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.848635 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "295e91cc-9024-4280-a782-3c7f3a2d19dc" (UID: "295e91cc-9024-4280-a782-3c7f3a2d19dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.848742 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878118 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878157 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878171 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d916771f-8789-42cf-aa66-11707d4825f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878183 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295e91cc-9024-4280-a782-3c7f3a2d19dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878196 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh752\" (UniqueName: \"kubernetes.io/projected/d916771f-8789-42cf-aa66-11707d4825f6-kube-api-access-zh752\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878208 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbrs6\" (UniqueName: \"kubernetes.io/projected/295e91cc-9024-4280-a782-3c7f3a2d19dc-kube-api-access-zbrs6\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.878219 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/295e91cc-9024-4280-a782-3c7f3a2d19dc-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.882474 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.88245934 podStartE2EDuration="2.88245934s" podCreationTimestamp="2025-10-13 13:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:47.868722094 +0000 UTC m=+1245.402272360" watchObservedRunningTime="2025-10-13 13:27:47.88245934 +0000 UTC m=+1245.416009596" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.960844 4797 scope.go:117] "RemoveContainer" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" Oct 13 13:27:47 crc kubenswrapper[4797]: E1013 13:27:47.961418 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff\": container with ID starting with d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff not found: ID does not exist" containerID="d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.961463 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff"} err="failed to get container status \"d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff\": rpc error: code = NotFound desc = could not find container \"d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff\": container with ID starting with d0e136c6ee06a6dbb29283674c3aa6dd2d3838158b94538d5b3654533f98cdff not found: ID does not exist" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.961491 4797 scope.go:117] 
"RemoveContainer" containerID="7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a" Oct 13 13:27:47 crc kubenswrapper[4797]: I1013 13:27:47.998737 4797 scope.go:117] "RemoveContainer" containerID="087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.045240 4797 scope.go:117] "RemoveContainer" containerID="7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a" Oct 13 13:27:48 crc kubenswrapper[4797]: E1013 13:27:48.047171 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a\": container with ID starting with 7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a not found: ID does not exist" containerID="7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.047205 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a"} err="failed to get container status \"7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a\": rpc error: code = NotFound desc = could not find container \"7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a\": container with ID starting with 7c864360250eda4150f9a3eacc5f7bd8c83b74c0a6eda4fcedee00c2f972660a not found: ID does not exist" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.047226 4797 scope.go:117] "RemoveContainer" containerID="087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e" Oct 13 13:27:48 crc kubenswrapper[4797]: E1013 13:27:48.047555 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e\": container with ID starting with 
087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e not found: ID does not exist" containerID="087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.047577 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e"} err="failed to get container status \"087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e\": rpc error: code = NotFound desc = could not find container \"087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e\": container with ID starting with 087de0154f9960b6ee388e1e995a5c6a2ef15b2fb7775bdccea9ec17c4e8774e not found: ID does not exist" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.130447 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.130491 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.159991 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.173193 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.185557 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.193384 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.200221 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: E1013 13:27:48.200797 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-log" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.200837 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-log" Oct 13 13:27:48 crc kubenswrapper[4797]: E1013 13:27:48.200860 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d916771f-8789-42cf-aa66-11707d4825f6" containerName="nova-scheduler-scheduler" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.200869 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d916771f-8789-42cf-aa66-11707d4825f6" containerName="nova-scheduler-scheduler" Oct 13 13:27:48 crc kubenswrapper[4797]: E1013 13:27:48.200893 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-api" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.200902 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-api" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.201129 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-api" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.201152 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" containerName="nova-api-log" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.201164 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d916771f-8789-42cf-aa66-11707d4825f6" containerName="nova-scheduler-scheduler" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.201988 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.203920 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.208041 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.209968 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.213458 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.236879 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.277640 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.287869 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.287903 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-config-data\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.287935 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-logs\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.288029 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.288057 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnfj\" (UniqueName: \"kubernetes.io/projected/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-kube-api-access-rtnfj\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.288141 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-config-data\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.288173 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2vq\" (UniqueName: \"kubernetes.io/projected/e9ad8baa-2c07-4598-9347-71831d4d264e-kube-api-access-mt2vq\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.390335 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.390410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnfj\" (UniqueName: \"kubernetes.io/projected/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-kube-api-access-rtnfj\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.390559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-config-data\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.390610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2vq\" (UniqueName: \"kubernetes.io/projected/e9ad8baa-2c07-4598-9347-71831d4d264e-kube-api-access-mt2vq\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.390643 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.390671 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-config-data\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.391768 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-logs\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.392277 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-logs\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.394760 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-config-data\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.395606 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-config-data\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.398246 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.407521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc 
kubenswrapper[4797]: I1013 13:27:48.407621 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2vq\" (UniqueName: \"kubernetes.io/projected/e9ad8baa-2c07-4598-9347-71831d4d264e-kube-api-access-mt2vq\") pod \"nova-scheduler-0\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.409223 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnfj\" (UniqueName: \"kubernetes.io/projected/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-kube-api-access-rtnfj\") pod \"nova-api-0\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") " pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.533267 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.565792 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.921543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerStarted","Data":"cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15"} Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.921838 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:27:48 crc kubenswrapper[4797]: I1013 13:27:48.948047 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.07888113 podStartE2EDuration="5.948029359s" podCreationTimestamp="2025-10-13 13:27:43 +0000 UTC" firstStartedPulling="2025-10-13 13:27:43.837080749 +0000 UTC m=+1241.370631015" lastFinishedPulling="2025-10-13 13:27:47.706228988 +0000 UTC m=+1245.239779244" observedRunningTime="2025-10-13 
13:27:48.944935283 +0000 UTC m=+1246.478485549" watchObservedRunningTime="2025-10-13 13:27:48.948029359 +0000 UTC m=+1246.481579615" Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.117856 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.250212 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295e91cc-9024-4280-a782-3c7f3a2d19dc" path="/var/lib/kubelet/pods/295e91cc-9024-4280-a782-3c7f3a2d19dc/volumes" Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.251323 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d916771f-8789-42cf-aa66-11707d4825f6" path="/var/lib/kubelet/pods/d916771f-8789-42cf-aa66-11707d4825f6/volumes" Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.251930 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.929182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9ad8baa-2c07-4598-9347-71831d4d264e","Type":"ContainerStarted","Data":"e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b"} Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.930430 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9ad8baa-2c07-4598-9347-71831d4d264e","Type":"ContainerStarted","Data":"8429031209a2425ff9179441bf86c64e86759cae28a1c556e15ffb31901e4634"} Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.933129 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b","Type":"ContainerStarted","Data":"a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36"} Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.933182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b","Type":"ContainerStarted","Data":"009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753"} Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.933196 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b","Type":"ContainerStarted","Data":"b7561e593917785385730e85745792d2d7853be126c4904381cddbc925d684f0"} Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.945861 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.945846221 podStartE2EDuration="1.945846221s" podCreationTimestamp="2025-10-13 13:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:49.943404031 +0000 UTC m=+1247.476954287" watchObservedRunningTime="2025-10-13 13:27:49.945846221 +0000 UTC m=+1247.479396477" Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.962670 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.962652042 podStartE2EDuration="1.962652042s" podCreationTimestamp="2025-10-13 13:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:27:49.958824849 +0000 UTC m=+1247.492375115" watchObservedRunningTime="2025-10-13 13:27:49.962652042 +0000 UTC m=+1247.496202288" Oct 13 13:27:49 crc kubenswrapper[4797]: I1013 13:27:49.976910 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 13 13:27:51 crc kubenswrapper[4797]: I1013 13:27:51.218833 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 13 13:27:53 crc kubenswrapper[4797]: I1013 13:27:53.131259 4797 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 13:27:53 crc kubenswrapper[4797]: I1013 13:27:53.131588 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 13:27:53 crc kubenswrapper[4797]: I1013 13:27:53.533874 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 13:27:54 crc kubenswrapper[4797]: I1013 13:27:54.148282 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 13:27:54 crc kubenswrapper[4797]: I1013 13:27:54.148286 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 13:27:58 crc kubenswrapper[4797]: I1013 13:27:58.534419 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 13:27:58 crc kubenswrapper[4797]: I1013 13:27:58.566230 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 13:27:58 crc kubenswrapper[4797]: I1013 13:27:58.566310 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 13:27:58 crc kubenswrapper[4797]: I1013 13:27:58.568754 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 13:27:59 crc kubenswrapper[4797]: I1013 13:27:59.063673 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Oct 13 13:27:59 crc kubenswrapper[4797]: I1013 13:27:59.648084 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 13:27:59 crc kubenswrapper[4797]: I1013 13:27:59.648393 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 13:28:03 crc kubenswrapper[4797]: I1013 13:28:03.136629 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 13:28:03 crc kubenswrapper[4797]: I1013 13:28:03.140488 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 13:28:03 crc kubenswrapper[4797]: I1013 13:28:03.144739 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 13:28:04 crc kubenswrapper[4797]: I1013 13:28:04.084288 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 13:28:05 crc kubenswrapper[4797]: I1013 13:28:05.902972 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.057974 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-config-data\") pod \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.058068 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-combined-ca-bundle\") pod \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.058239 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-kube-api-access-p78bs\") pod \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\" (UID: \"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1\") " Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.070795 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-kube-api-access-p78bs" (OuterVolumeSpecName: "kube-api-access-p78bs") pod "126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" (UID: "126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1"). InnerVolumeSpecName "kube-api-access-p78bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.100787 4797 generic.go:334] "Generic (PLEG): container finished" podID="126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" containerID="f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a" exitCode=137 Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.100847 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.100881 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1","Type":"ContainerDied","Data":"f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a"} Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.100981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1","Type":"ContainerDied","Data":"4a3819d45087ff74b060fb47730b7c74d5a2bf2ff7fcafc7a5db29c231cd9231"} Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.101026 4797 scope.go:117] "RemoveContainer" containerID="f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.110054 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-config-data" (OuterVolumeSpecName: "config-data") pod "126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" (UID: "126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.111306 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" (UID: "126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.161437 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.161468 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-kube-api-access-p78bs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.161481 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.214118 4797 scope.go:117] "RemoveContainer" containerID="f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a" Oct 13 13:28:06 crc kubenswrapper[4797]: E1013 13:28:06.214606 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a\": container with ID starting with f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a not found: ID does not exist" containerID="f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.214649 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a"} err="failed to get container status \"f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a\": rpc error: code = NotFound desc = could not find container \"f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a\": container with ID 
starting with f4972aa92a262458c3b812eecd8c851cc85886977df5fed9fd6db1c7e3ce766a not found: ID does not exist" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.467474 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.483520 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.494212 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:28:06 crc kubenswrapper[4797]: E1013 13:28:06.494704 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.494730 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.494971 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.495683 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.498281 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.498297 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.502355 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.503390 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.672420 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.672508 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.672615 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc 
kubenswrapper[4797]: I1013 13:28:06.672659 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.672712 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrrtq\" (UniqueName: \"kubernetes.io/projected/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-kube-api-access-xrrtq\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.774137 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.774214 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.774299 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrrtq\" (UniqueName: \"kubernetes.io/projected/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-kube-api-access-xrrtq\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc 
kubenswrapper[4797]: I1013 13:28:06.774429 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.774507 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.780493 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.780920 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.782359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.783140 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.792571 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrrtq\" (UniqueName: \"kubernetes.io/projected/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-kube-api-access-xrrtq\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:06 crc kubenswrapper[4797]: I1013 13:28:06.820586 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:07 crc kubenswrapper[4797]: I1013 13:28:07.263584 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1" path="/var/lib/kubelet/pods/126b2f1c-8c5f-431d-8d9d-aefc2ca0d4c1/volumes" Oct 13 13:28:07 crc kubenswrapper[4797]: I1013 13:28:07.321089 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.122262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e","Type":"ContainerStarted","Data":"f7e069b9ab89c7959910da337a2d82dec852dac12fc5e241175f7c593d851a00"} Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.122560 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e","Type":"ContainerStarted","Data":"b83f748df79935f480a3ca22d9fd0f8e3033754f61916af139f831e4c58a27ae"} Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.142625 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.142603346 podStartE2EDuration="2.142603346s" podCreationTimestamp="2025-10-13 13:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:08.135608095 +0000 UTC m=+1265.669158371" watchObservedRunningTime="2025-10-13 13:28:08.142603346 +0000 UTC m=+1265.676153602" Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.571675 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.573710 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.573775 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 13:28:08 crc kubenswrapper[4797]: I1013 13:28:08.578358 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.131991 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.135605 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.330757 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65bf758599-wncdc"] Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.332638 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.360932 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bf758599-wncdc"] Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.533566 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-swift-storage-0\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.533624 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-sb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.533673 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-config\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.533702 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-svc\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.533729 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-s2bvb\" (UniqueName: \"kubernetes.io/projected/a7fec705-3fa8-4f2b-aa9d-1afec561d884-kube-api-access-s2bvb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.533769 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-nb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.635251 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-config\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.635324 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-svc\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.635364 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2bvb\" (UniqueName: \"kubernetes.io/projected/a7fec705-3fa8-4f2b-aa9d-1afec561d884-kube-api-access-s2bvb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.635400 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-nb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.635467 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-swift-storage-0\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.635519 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-sb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.636407 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-config\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.636422 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-svc\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.636451 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-sb\") pod 
\"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.636498 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-nb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.636660 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-swift-storage-0\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.656670 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2bvb\" (UniqueName: \"kubernetes.io/projected/a7fec705-3fa8-4f2b-aa9d-1afec561d884-kube-api-access-s2bvb\") pod \"dnsmasq-dns-65bf758599-wncdc\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:09 crc kubenswrapper[4797]: I1013 13:28:09.664634 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bf758599-wncdc"
Oct 13 13:28:10 crc kubenswrapper[4797]: I1013 13:28:10.306386 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bf758599-wncdc"]
Oct 13 13:28:10 crc kubenswrapper[4797]: W1013 13:28:10.309427 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7fec705_3fa8_4f2b_aa9d_1afec561d884.slice/crio-48dc4b9d6c500fbfa452e70a74642c8c46bc5987d563149c8f6821dfa227895d WatchSource:0}: Error finding container 48dc4b9d6c500fbfa452e70a74642c8c46bc5987d563149c8f6821dfa227895d: Status 404 returned error can't find the container with id 48dc4b9d6c500fbfa452e70a74642c8c46bc5987d563149c8f6821dfa227895d
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.152670 4797 generic.go:334] "Generic (PLEG): container finished" podID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerID="8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724" exitCode=0
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.153079 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bf758599-wncdc" event={"ID":"a7fec705-3fa8-4f2b-aa9d-1afec561d884","Type":"ContainerDied","Data":"8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724"}
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.153140 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bf758599-wncdc" event={"ID":"a7fec705-3fa8-4f2b-aa9d-1afec561d884","Type":"ContainerStarted","Data":"48dc4b9d6c500fbfa452e70a74642c8c46bc5987d563149c8f6821dfa227895d"}
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.698411 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.699307 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-central-agent" containerID="cri-o://65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7" gracePeriod=30
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.699338 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="proxy-httpd" containerID="cri-o://cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15" gracePeriod=30
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.699416 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="sg-core" containerID="cri-o://aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240" gracePeriod=30
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.699504 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-notification-agent" containerID="cri-o://b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777" gracePeriod=30
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.709002 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.195:3000/\": EOF"
Oct 13 13:28:11 crc kubenswrapper[4797]: I1013 13:28:11.822656 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.162271 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bf758599-wncdc" event={"ID":"a7fec705-3fa8-4f2b-aa9d-1afec561d884","Type":"ContainerStarted","Data":"f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3"}
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.162463 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65bf758599-wncdc"
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.165117 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerID="cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15" exitCode=0
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.165158 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerID="aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240" exitCode=2
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.165169 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerID="65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7" exitCode=0
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.165174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerDied","Data":"cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15"}
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.165203 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerDied","Data":"aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240"}
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.165214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerDied","Data":"65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7"}
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.204293 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65bf758599-wncdc" podStartSLOduration=3.2042761459999998 podStartE2EDuration="3.204276146s" podCreationTimestamp="2025-10-13 13:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:12.197947211 +0000 UTC m=+1269.731497477" watchObservedRunningTime="2025-10-13 13:28:12.204276146 +0000 UTC m=+1269.737826402"
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.231422 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.231604 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-log" containerID="cri-o://009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753" gracePeriod=30
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.233549 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-api" containerID="cri-o://a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36" gracePeriod=30
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.566154 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.713958 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-scripts\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714020 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-config-data\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714076 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njfsk\" (UniqueName: \"kubernetes.io/projected/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-kube-api-access-njfsk\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714102 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-ceilometer-tls-certs\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714187 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-log-httpd\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714219 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-run-httpd\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714272 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-sg-core-conf-yaml\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.714344 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-combined-ca-bundle\") pod \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\" (UID: \"8c6a6659-fb80-4cb0-be00-6582c7b3dba0\") "
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.720206 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.720439 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.743054 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-kube-api-access-njfsk" (OuterVolumeSpecName: "kube-api-access-njfsk") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "kube-api-access-njfsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.771973 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-scripts" (OuterVolumeSpecName: "scripts") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.816850 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.817078 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.817090 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-scripts\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.817102 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njfsk\" (UniqueName: \"kubernetes.io/projected/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-kube-api-access-njfsk\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.842853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.872531 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.874292 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.901310 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-config-data" (OuterVolumeSpecName: "config-data") pod "8c6a6659-fb80-4cb0-be00-6582c7b3dba0" (UID: "8c6a6659-fb80-4cb0-be00-6582c7b3dba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.918630 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.918673 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.918685 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:12 crc kubenswrapper[4797]: I1013 13:28:12.918697 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c6a6659-fb80-4cb0-be00-6582c7b3dba0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.173789 4797 generic.go:334] "Generic (PLEG): container finished" podID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerID="009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753" exitCode=143
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.173865 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b","Type":"ContainerDied","Data":"009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753"}
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.176245 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerID="b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777" exitCode=0
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.176619 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.177252 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerDied","Data":"b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777"}
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.177277 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c6a6659-fb80-4cb0-be00-6582c7b3dba0","Type":"ContainerDied","Data":"c00a186ed1a93dce95f9aefbc72fb153ff03a646780e38011fe1d97a32fa61c4"}
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.177294 4797 scope.go:117] "RemoveContainer" containerID="cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.196831 4797 scope.go:117] "RemoveContainer" containerID="aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.214113 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.222160 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.242955 4797 scope.go:117] "RemoveContainer" containerID="b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.280590 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" path="/var/lib/kubelet/pods/8c6a6659-fb80-4cb0-be00-6582c7b3dba0/volumes"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.281658 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.282104 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="proxy-httpd"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282127 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="proxy-httpd"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.282149 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-notification-agent"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282160 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-notification-agent"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.282191 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="sg-core"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282199 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="sg-core"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.282212 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-central-agent"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282219 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-central-agent"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282439 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-notification-agent"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282457 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="proxy-httpd"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282472 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="ceilometer-central-agent"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.282501 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a6659-fb80-4cb0-be00-6582c7b3dba0" containerName="sg-core"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.289248 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.289377 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.292978 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.293197 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.293260 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.298126 4797 scope.go:117] "RemoveContainer" containerID="65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.327766 4797 scope.go:117] "RemoveContainer" containerID="cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.328247 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15\": container with ID starting with cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15 not found: ID does not exist" containerID="cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.328281 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15"} err="failed to get container status \"cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15\": rpc error: code = NotFound desc = could not find container \"cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15\": container with ID starting with cff4fa9cc1c7c7fe2ce49da33153fffa149734e73a060095c01b3199af634a15 not found: ID does not exist"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.328310 4797 scope.go:117] "RemoveContainer" containerID="aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.328551 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240\": container with ID starting with aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240 not found: ID does not exist" containerID="aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.328575 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240"} err="failed to get container status \"aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240\": rpc error: code = NotFound desc = could not find container \"aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240\": container with ID starting with aff88e478b26dd2a4d0f4775dd1261c763352892b1cb769932fe57497cb8a240 not found: ID does not exist"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.328592 4797 scope.go:117] "RemoveContainer" containerID="b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.328795 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777\": container with ID starting with b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777 not found: ID does not exist" containerID="b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.328902 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777"} err="failed to get container status \"b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777\": rpc error: code = NotFound desc = could not find container \"b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777\": container with ID starting with b4664bfefc234f895e831a2503689ccee8e6be6a7c2ec6e616843e49456d8777 not found: ID does not exist"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.328923 4797 scope.go:117] "RemoveContainer" containerID="65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7"
Oct 13 13:28:13 crc kubenswrapper[4797]: E1013 13:28:13.329103 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7\": container with ID starting with 65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7 not found: ID does not exist" containerID="65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.329135 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7"} err="failed to get container status \"65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7\": rpc error: code = NotFound desc = could not find container \"65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7\": container with ID starting with 65c9b2a04e7d86e7c70b6e68148cbfb2133e7e8c1fa064fefd55d86f1526d8e7 not found: ID does not exist"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427376 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-run-httpd\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-scripts\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427508 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427567 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-log-httpd\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427597 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtz26\" (UniqueName: \"kubernetes.io/projected/8c12a30d-f475-4e40-ac5f-0c03c238fb89-kube-api-access-vtz26\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427638 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427672 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-config-data\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.427720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529158 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529234 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-run-httpd\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529285 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-scripts\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529305 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529348 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-log-httpd\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529369 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtz26\" (UniqueName: \"kubernetes.io/projected/8c12a30d-f475-4e40-ac5f-0c03c238fb89-kube-api-access-vtz26\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529397 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529418 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-config-data\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.529774 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-run-httpd\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.530010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-log-httpd\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.533980 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-scripts\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.533979 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.534675 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.535641 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.540750 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-config-data\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.549555 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtz26\" (UniqueName: \"kubernetes.io/projected/8c12a30d-f475-4e40-ac5f-0c03c238fb89-kube-api-access-vtz26\") pod \"ceilometer-0\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.609343 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 13 13:28:13 crc kubenswrapper[4797]: I1013 13:28:13.916266 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:14 crc kubenswrapper[4797]: I1013 13:28:14.086658 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 13:28:14 crc kubenswrapper[4797]: I1013 13:28:14.124169 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 13:28:14 crc kubenswrapper[4797]: I1013 13:28:14.185186 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerStarted","Data":"5b9ea7bdd9f09d26c717c7300432d1d81962b0ae20127762cc269306b273354e"}
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.195693 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerStarted","Data":"7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03"}
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.770957 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.877067 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtnfj\" (UniqueName: \"kubernetes.io/projected/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-kube-api-access-rtnfj\") pod \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") "
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.877162 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-config-data\") pod \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") "
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.877235 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-logs\") pod \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") "
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.877349 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-combined-ca-bundle\") pod \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\" (UID: \"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b\") "
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.879826 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-logs" (OuterVolumeSpecName: "logs") pod "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" (UID: "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.884949 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-kube-api-access-rtnfj" (OuterVolumeSpecName: "kube-api-access-rtnfj") pod "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" (UID: "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b"). InnerVolumeSpecName "kube-api-access-rtnfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.913178 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-config-data" (OuterVolumeSpecName: "config-data") pod "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" (UID: "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.922331 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" (UID: "2dbfdf40-54bd-48ca-a55e-1a40ed46b41b"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.983141 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtnfj\" (UniqueName: \"kubernetes.io/projected/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-kube-api-access-rtnfj\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.983209 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.983223 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:15 crc kubenswrapper[4797]: I1013 13:28:15.983233 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.209576 4797 generic.go:334] "Generic (PLEG): container finished" podID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerID="a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36" exitCode=0 Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.209653 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.209633 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b","Type":"ContainerDied","Data":"a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36"} Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.213243 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dbfdf40-54bd-48ca-a55e-1a40ed46b41b","Type":"ContainerDied","Data":"b7561e593917785385730e85745792d2d7853be126c4904381cddbc925d684f0"} Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.213280 4797 scope.go:117] "RemoveContainer" containerID="a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.217257 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerStarted","Data":"d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03"} Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.236252 4797 scope.go:117] "RemoveContainer" containerID="009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.251738 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.263284 4797 scope.go:117] "RemoveContainer" containerID="a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36" Oct 13 13:28:16 crc kubenswrapper[4797]: E1013 13:28:16.269149 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36\": container with ID starting with a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36 not found: ID does 
not exist" containerID="a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.269199 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36"} err="failed to get container status \"a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36\": rpc error: code = NotFound desc = could not find container \"a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36\": container with ID starting with a49e5d7ef03e38510d8582a1a5568287b39a3cc8d9db5b9c20c9fc1e9eb55a36 not found: ID does not exist" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.269226 4797 scope.go:117] "RemoveContainer" containerID="009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.270190 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:16 crc kubenswrapper[4797]: E1013 13:28:16.271351 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753\": container with ID starting with 009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753 not found: ID does not exist" containerID="009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.271375 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753"} err="failed to get container status \"009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753\": rpc error: code = NotFound desc = could not find container \"009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753\": container with ID starting with 
009892dd887d85a6910109fd57146eb5fe606d9b5225dc6109009e2b8336e753 not found: ID does not exist" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.283785 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:16 crc kubenswrapper[4797]: E1013 13:28:16.284239 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-api" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.284255 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-api" Oct 13 13:28:16 crc kubenswrapper[4797]: E1013 13:28:16.284269 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-log" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.284277 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-log" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.284471 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-api" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.284495 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" containerName="nova-api-log" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.285418 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.287855 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.288139 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.289560 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.311915 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.395626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.395757 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.395869 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-public-tls-certs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.395906 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3bd0bbd4-8354-4260-9c9d-f8263046f438-logs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.395934 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-config-data\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.395973 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5tz\" (UniqueName: \"kubernetes.io/projected/3bd0bbd4-8354-4260-9c9d-f8263046f438-kube-api-access-2v5tz\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.497816 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.497920 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-public-tls-certs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.497952 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd0bbd4-8354-4260-9c9d-f8263046f438-logs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc 
kubenswrapper[4797]: I1013 13:28:16.497971 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-config-data\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.498025 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v5tz\" (UniqueName: \"kubernetes.io/projected/3bd0bbd4-8354-4260-9c9d-f8263046f438-kube-api-access-2v5tz\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.498046 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.509568 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-config-data\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.510538 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd0bbd4-8354-4260-9c9d-f8263046f438-logs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.511999 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.512308 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.517216 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.548064 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5tz\" (UniqueName: \"kubernetes.io/projected/3bd0bbd4-8354-4260-9c9d-f8263046f438-kube-api-access-2v5tz\") pod \"nova-api-0\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.658568 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.822739 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:16 crc kubenswrapper[4797]: I1013 13:28:16.843189 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.115593 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:17 crc kubenswrapper[4797]: W1013 13:28:17.124581 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bd0bbd4_8354_4260_9c9d_f8263046f438.slice/crio-362479ab37f6c7b3789bdd760894f0784d66557e8fcee453a6612e8a0a83acf5 WatchSource:0}: Error finding container 362479ab37f6c7b3789bdd760894f0784d66557e8fcee453a6612e8a0a83acf5: Status 404 returned error can't find the container with id 362479ab37f6c7b3789bdd760894f0784d66557e8fcee453a6612e8a0a83acf5 Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.227629 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerStarted","Data":"3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8"} Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.229778 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bd0bbd4-8354-4260-9c9d-f8263046f438","Type":"ContainerStarted","Data":"362479ab37f6c7b3789bdd760894f0784d66557e8fcee453a6612e8a0a83acf5"} Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.247375 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dbfdf40-54bd-48ca-a55e-1a40ed46b41b" path="/var/lib/kubelet/pods/2dbfdf40-54bd-48ca-a55e-1a40ed46b41b/volumes" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 
13:28:17.248694 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.421503 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z9wd5"] Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.422941 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.435822 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.436042 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.437917 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z9wd5"] Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.519264 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-config-data\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.519683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n8db\" (UniqueName: \"kubernetes.io/projected/8c56c463-d9bb-46df-995e-424abef44adc-kube-api-access-9n8db\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.519727 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.519751 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-scripts\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.621319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-config-data\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.621439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n8db\" (UniqueName: \"kubernetes.io/projected/8c56c463-d9bb-46df-995e-424abef44adc-kube-api-access-9n8db\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.621482 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.621498 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-scripts\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.627749 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-scripts\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.628361 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.630716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-config-data\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.645879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n8db\" (UniqueName: \"kubernetes.io/projected/8c56c463-d9bb-46df-995e-424abef44adc-kube-api-access-9n8db\") pod \"nova-cell1-cell-mapping-z9wd5\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:17 crc kubenswrapper[4797]: I1013 13:28:17.813960 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:18 crc kubenswrapper[4797]: I1013 13:28:18.248590 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bd0bbd4-8354-4260-9c9d-f8263046f438","Type":"ContainerStarted","Data":"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9"} Oct 13 13:28:18 crc kubenswrapper[4797]: I1013 13:28:18.249005 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bd0bbd4-8354-4260-9c9d-f8263046f438","Type":"ContainerStarted","Data":"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94"} Oct 13 13:28:18 crc kubenswrapper[4797]: I1013 13:28:18.282503 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.282477251 podStartE2EDuration="2.282477251s" podCreationTimestamp="2025-10-13 13:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:18.270985069 +0000 UTC m=+1275.804535375" watchObservedRunningTime="2025-10-13 13:28:18.282477251 +0000 UTC m=+1275.816027517" Oct 13 13:28:18 crc kubenswrapper[4797]: I1013 13:28:18.282915 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z9wd5"] Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.259850 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerStarted","Data":"61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c"} Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.260134 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-central-agent" 
containerID="cri-o://7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03" gracePeriod=30 Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.260604 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="proxy-httpd" containerID="cri-o://61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c" gracePeriod=30 Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.260653 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="sg-core" containerID="cri-o://3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8" gracePeriod=30 Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.260687 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.260691 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-notification-agent" containerID="cri-o://d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03" gracePeriod=30 Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.264947 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z9wd5" event={"ID":"8c56c463-d9bb-46df-995e-424abef44adc","Type":"ContainerStarted","Data":"ddbd9fbba7afb83602cfa1c4d7828d401160383366b2cc9f83c9b5cab203b153"} Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.264978 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z9wd5" event={"ID":"8c56c463-d9bb-46df-995e-424abef44adc","Type":"ContainerStarted","Data":"d18b989c297f9a3ea4142180a475acf07f4b9c41a6d9ffd348736c6646fdfb7d"} Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 
13:28:19.283094 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.444376191 podStartE2EDuration="6.283059829s" podCreationTimestamp="2025-10-13 13:28:13 +0000 UTC" firstStartedPulling="2025-10-13 13:28:14.097970496 +0000 UTC m=+1271.631520752" lastFinishedPulling="2025-10-13 13:28:18.936654104 +0000 UTC m=+1276.470204390" observedRunningTime="2025-10-13 13:28:19.281502551 +0000 UTC m=+1276.815052827" watchObservedRunningTime="2025-10-13 13:28:19.283059829 +0000 UTC m=+1276.816610085" Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.309760 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z9wd5" podStartSLOduration=2.309738592 podStartE2EDuration="2.309738592s" podCreationTimestamp="2025-10-13 13:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:19.300548597 +0000 UTC m=+1276.834098873" watchObservedRunningTime="2025-10-13 13:28:19.309738592 +0000 UTC m=+1276.843288858" Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.665960 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.749653 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d74749bf5-5lfsm"] Oct 13 13:28:19 crc kubenswrapper[4797]: I1013 13:28:19.749912 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="dnsmasq-dns" containerID="cri-o://82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d" gracePeriod=10 Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.191984 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.271715 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-swift-storage-0\") pod \"3b04026b-c0b6-4373-b87a-0eaea7d96163\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.271862 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-sb\") pod \"3b04026b-c0b6-4373-b87a-0eaea7d96163\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.271958 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-svc\") pod \"3b04026b-c0b6-4373-b87a-0eaea7d96163\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.272016 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-config\") pod \"3b04026b-c0b6-4373-b87a-0eaea7d96163\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.272090 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-nb\") pod \"3b04026b-c0b6-4373-b87a-0eaea7d96163\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.272227 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwnch\" 
(UniqueName: \"kubernetes.io/projected/3b04026b-c0b6-4373-b87a-0eaea7d96163-kube-api-access-nwnch\") pod \"3b04026b-c0b6-4373-b87a-0eaea7d96163\" (UID: \"3b04026b-c0b6-4373-b87a-0eaea7d96163\") " Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.290174 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b04026b-c0b6-4373-b87a-0eaea7d96163-kube-api-access-nwnch" (OuterVolumeSpecName: "kube-api-access-nwnch") pod "3b04026b-c0b6-4373-b87a-0eaea7d96163" (UID: "3b04026b-c0b6-4373-b87a-0eaea7d96163"). InnerVolumeSpecName "kube-api-access-nwnch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.357101 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerID="3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8" exitCode=2 Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.357156 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerID="d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03" exitCode=0 Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.357166 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerID="7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03" exitCode=0 Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.357979 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerDied","Data":"3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8"} Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.358068 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerDied","Data":"d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03"} Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.358085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerDied","Data":"7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03"} Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.416387 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwnch\" (UniqueName: \"kubernetes.io/projected/3b04026b-c0b6-4373-b87a-0eaea7d96163-kube-api-access-nwnch\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.417034 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-config" (OuterVolumeSpecName: "config") pod "3b04026b-c0b6-4373-b87a-0eaea7d96163" (UID: "3b04026b-c0b6-4373-b87a-0eaea7d96163"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.425369 4797 generic.go:334] "Generic (PLEG): container finished" podID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerID="82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d" exitCode=0 Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.426175 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.426587 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" event={"ID":"3b04026b-c0b6-4373-b87a-0eaea7d96163","Type":"ContainerDied","Data":"82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d"} Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.426610 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" event={"ID":"3b04026b-c0b6-4373-b87a-0eaea7d96163","Type":"ContainerDied","Data":"a4f50de1974402f5f3119a93d2d2072d884571a013e7c4332855e24c06818dba"} Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.426626 4797 scope.go:117] "RemoveContainer" containerID="82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.454194 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b04026b-c0b6-4373-b87a-0eaea7d96163" (UID: "3b04026b-c0b6-4373-b87a-0eaea7d96163"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.468210 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b04026b-c0b6-4373-b87a-0eaea7d96163" (UID: "3b04026b-c0b6-4373-b87a-0eaea7d96163"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.471468 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b04026b-c0b6-4373-b87a-0eaea7d96163" (UID: "3b04026b-c0b6-4373-b87a-0eaea7d96163"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.475661 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b04026b-c0b6-4373-b87a-0eaea7d96163" (UID: "3b04026b-c0b6-4373-b87a-0eaea7d96163"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.517062 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.517092 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.517102 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.517111 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:20 crc 
kubenswrapper[4797]: I1013 13:28:20.517119 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b04026b-c0b6-4373-b87a-0eaea7d96163-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.533019 4797 scope.go:117] "RemoveContainer" containerID="0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.552314 4797 scope.go:117] "RemoveContainer" containerID="82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d" Oct 13 13:28:20 crc kubenswrapper[4797]: E1013 13:28:20.552831 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d\": container with ID starting with 82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d not found: ID does not exist" containerID="82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.552892 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d"} err="failed to get container status \"82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d\": rpc error: code = NotFound desc = could not find container \"82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d\": container with ID starting with 82c136de1437b97cd26a1364e4f0882585e571d85c4318f416061c9b95f79c3d not found: ID does not exist" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.552929 4797 scope.go:117] "RemoveContainer" containerID="0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622" Oct 13 13:28:20 crc kubenswrapper[4797]: E1013 13:28:20.553268 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622\": container with ID starting with 0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622 not found: ID does not exist" containerID="0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.553305 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622"} err="failed to get container status \"0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622\": rpc error: code = NotFound desc = could not find container \"0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622\": container with ID starting with 0ca43873ba235d687e8e91e7f952ef663cce7934a24dde81c51aa6374cbd5622 not found: ID does not exist" Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.758113 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d74749bf5-5lfsm"] Oct 13 13:28:20 crc kubenswrapper[4797]: I1013 13:28:20.768139 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d74749bf5-5lfsm"] Oct 13 13:28:21 crc kubenswrapper[4797]: I1013 13:28:21.245465 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" path="/var/lib/kubelet/pods/3b04026b-c0b6-4373-b87a-0eaea7d96163/volumes" Oct 13 13:28:23 crc kubenswrapper[4797]: I1013 13:28:23.458734 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c56c463-d9bb-46df-995e-424abef44adc" containerID="ddbd9fbba7afb83602cfa1c4d7828d401160383366b2cc9f83c9b5cab203b153" exitCode=0 Oct 13 13:28:23 crc kubenswrapper[4797]: I1013 13:28:23.458870 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z9wd5" 
event={"ID":"8c56c463-d9bb-46df-995e-424abef44adc","Type":"ContainerDied","Data":"ddbd9fbba7afb83602cfa1c4d7828d401160383366b2cc9f83c9b5cab203b153"} Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.813127 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.917102 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-combined-ca-bundle\") pod \"8c56c463-d9bb-46df-995e-424abef44adc\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.917178 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-config-data\") pod \"8c56c463-d9bb-46df-995e-424abef44adc\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.917231 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-scripts\") pod \"8c56c463-d9bb-46df-995e-424abef44adc\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.917501 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n8db\" (UniqueName: \"kubernetes.io/projected/8c56c463-d9bb-46df-995e-424abef44adc-kube-api-access-9n8db\") pod \"8c56c463-d9bb-46df-995e-424abef44adc\" (UID: \"8c56c463-d9bb-46df-995e-424abef44adc\") " Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.923616 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c56c463-d9bb-46df-995e-424abef44adc-kube-api-access-9n8db" (OuterVolumeSpecName: 
"kube-api-access-9n8db") pod "8c56c463-d9bb-46df-995e-424abef44adc" (UID: "8c56c463-d9bb-46df-995e-424abef44adc"). InnerVolumeSpecName "kube-api-access-9n8db". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.924593 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-scripts" (OuterVolumeSpecName: "scripts") pod "8c56c463-d9bb-46df-995e-424abef44adc" (UID: "8c56c463-d9bb-46df-995e-424abef44adc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.950331 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c56c463-d9bb-46df-995e-424abef44adc" (UID: "8c56c463-d9bb-46df-995e-424abef44adc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:24 crc kubenswrapper[4797]: I1013 13:28:24.955001 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-config-data" (OuterVolumeSpecName: "config-data") pod "8c56c463-d9bb-46df-995e-424abef44adc" (UID: "8c56c463-d9bb-46df-995e-424abef44adc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.023400 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.023454 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.023467 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n8db\" (UniqueName: \"kubernetes.io/projected/8c56c463-d9bb-46df-995e-424abef44adc-kube-api-access-9n8db\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.023480 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c56c463-d9bb-46df-995e-424abef44adc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.124486 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d74749bf5-5lfsm" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.189:5353: i/o timeout" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.481489 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z9wd5" event={"ID":"8c56c463-d9bb-46df-995e-424abef44adc","Type":"ContainerDied","Data":"d18b989c297f9a3ea4142180a475acf07f4b9c41a6d9ffd348736c6646fdfb7d"} Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.481549 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z9wd5" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.481566 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18b989c297f9a3ea4142180a475acf07f4b9c41a6d9ffd348736c6646fdfb7d" Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.749650 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.752298 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-log" containerID="cri-o://a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94" gracePeriod=30 Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.752350 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-api" containerID="cri-o://f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9" gracePeriod=30 Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.770390 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.770949 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e9ad8baa-2c07-4598-9347-71831d4d264e" containerName="nova-scheduler-scheduler" containerID="cri-o://e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b" gracePeriod=30 Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.783724 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.784008 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" 
containerName="nova-metadata-log" containerID="cri-o://1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6" gracePeriod=30 Oct 13 13:28:25 crc kubenswrapper[4797]: I1013 13:28:25.784205 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-metadata" containerID="cri-o://a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185" gracePeriod=30 Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.414587 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.497368 4797 generic.go:334] "Generic (PLEG): container finished" podID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerID="1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6" exitCode=143 Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.497444 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4aac353b-6a2c-4072-b40f-cc91a3907bce","Type":"ContainerDied","Data":"1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6"} Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499534 4797 generic.go:334] "Generic (PLEG): container finished" podID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerID="f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9" exitCode=0 Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499561 4797 generic.go:334] "Generic (PLEG): container finished" podID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerID="a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94" exitCode=143 Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499582 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bd0bbd4-8354-4260-9c9d-f8263046f438","Type":"ContainerDied","Data":"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9"} Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bd0bbd4-8354-4260-9c9d-f8263046f438","Type":"ContainerDied","Data":"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94"} Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bd0bbd4-8354-4260-9c9d-f8263046f438","Type":"ContainerDied","Data":"362479ab37f6c7b3789bdd760894f0784d66557e8fcee453a6612e8a0a83acf5"} Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.499750 4797 scope.go:117] "RemoveContainer" containerID="f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.525402 4797 scope.go:117] "RemoveContainer" containerID="a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.543744 4797 scope.go:117] "RemoveContainer" containerID="f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9" Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.544262 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9\": container with ID starting with f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9 not found: ID does not exist" containerID="f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.545036 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9"} err="failed to get container status \"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9\": rpc error: code = NotFound desc = could not find container \"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9\": container with ID starting with f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9 not found: ID does not exist" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.545211 4797 scope.go:117] "RemoveContainer" containerID="a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94" Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.545887 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94\": container with ID starting with a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94 not found: ID does not exist" containerID="a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.545955 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94"} err="failed to get container status \"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94\": rpc error: code = NotFound desc = could not find container \"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94\": container with ID starting with a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94 not found: ID does not exist" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.546003 4797 scope.go:117] "RemoveContainer" containerID="f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 
13:28:26.546876 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9"} err="failed to get container status \"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9\": rpc error: code = NotFound desc = could not find container \"f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9\": container with ID starting with f79811d1d5c5d25067c75c79369c437d4d6633cf1c9cb45a20172b657b1c7cf9 not found: ID does not exist" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.546913 4797 scope.go:117] "RemoveContainer" containerID="a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.547377 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94"} err="failed to get container status \"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94\": rpc error: code = NotFound desc = could not find container \"a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94\": container with ID starting with a4720f10b64a94eb58f734f53910281591a56f01293cd367161902ec94520f94 not found: ID does not exist" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.562870 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-config-data\") pod \"3bd0bbd4-8354-4260-9c9d-f8263046f438\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.562969 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-combined-ca-bundle\") pod \"3bd0bbd4-8354-4260-9c9d-f8263046f438\" (UID: 
\"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.562997 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v5tz\" (UniqueName: \"kubernetes.io/projected/3bd0bbd4-8354-4260-9c9d-f8263046f438-kube-api-access-2v5tz\") pod \"3bd0bbd4-8354-4260-9c9d-f8263046f438\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.563105 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-internal-tls-certs\") pod \"3bd0bbd4-8354-4260-9c9d-f8263046f438\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.563182 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd0bbd4-8354-4260-9c9d-f8263046f438-logs\") pod \"3bd0bbd4-8354-4260-9c9d-f8263046f438\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.563283 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-public-tls-certs\") pod \"3bd0bbd4-8354-4260-9c9d-f8263046f438\" (UID: \"3bd0bbd4-8354-4260-9c9d-f8263046f438\") " Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.564067 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd0bbd4-8354-4260-9c9d-f8263046f438-logs" (OuterVolumeSpecName: "logs") pod "3bd0bbd4-8354-4260-9c9d-f8263046f438" (UID: "3bd0bbd4-8354-4260-9c9d-f8263046f438"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.573920 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd0bbd4-8354-4260-9c9d-f8263046f438-kube-api-access-2v5tz" (OuterVolumeSpecName: "kube-api-access-2v5tz") pod "3bd0bbd4-8354-4260-9c9d-f8263046f438" (UID: "3bd0bbd4-8354-4260-9c9d-f8263046f438"). InnerVolumeSpecName "kube-api-access-2v5tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.604169 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bd0bbd4-8354-4260-9c9d-f8263046f438" (UID: "3bd0bbd4-8354-4260-9c9d-f8263046f438"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.605227 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-config-data" (OuterVolumeSpecName: "config-data") pod "3bd0bbd4-8354-4260-9c9d-f8263046f438" (UID: "3bd0bbd4-8354-4260-9c9d-f8263046f438"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.628649 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3bd0bbd4-8354-4260-9c9d-f8263046f438" (UID: "3bd0bbd4-8354-4260-9c9d-f8263046f438"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.631253 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3bd0bbd4-8354-4260-9c9d-f8263046f438" (UID: "3bd0bbd4-8354-4260-9c9d-f8263046f438"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.665627 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.668407 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.668537 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v5tz\" (UniqueName: \"kubernetes.io/projected/3bd0bbd4-8354-4260-9c9d-f8263046f438-kube-api-access-2v5tz\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.668640 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.668770 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd0bbd4-8354-4260-9c9d-f8263046f438-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.668880 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3bd0bbd4-8354-4260-9c9d-f8263046f438-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.842872 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.872284 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.882697 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.883659 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c56c463-d9bb-46df-995e-424abef44adc" containerName="nova-manage" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.883764 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c56c463-d9bb-46df-995e-424abef44adc" containerName="nova-manage" Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.883893 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-api" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.883986 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-api" Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.884087 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="init" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.884213 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="init" Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.884325 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="dnsmasq-dns" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.884400 4797 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="dnsmasq-dns" Oct 13 13:28:26 crc kubenswrapper[4797]: E1013 13:28:26.884490 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-log" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.884560 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-log" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.884874 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c56c463-d9bb-46df-995e-424abef44adc" containerName="nova-manage" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.884986 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-api" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.885072 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" containerName="nova-api-log" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.885151 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b04026b-c0b6-4373-b87a-0eaea7d96163" containerName="dnsmasq-dns" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.886420 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.889695 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.889740 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.889703 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.892924 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.973892 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572w4\" (UniqueName: \"kubernetes.io/projected/e2a3161e-b16d-436d-b547-87e182ef5e27-kube-api-access-572w4\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.973967 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.973995 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-config-data\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.974026 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.974051 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2a3161e-b16d-436d-b547-87e182ef5e27-logs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:26 crc kubenswrapper[4797]: I1013 13:28:26.974080 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.075401 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.075560 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572w4\" (UniqueName: \"kubernetes.io/projected/e2a3161e-b16d-436d-b547-87e182ef5e27-kube-api-access-572w4\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.075619 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " 
pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.075651 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-config-data\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.075683 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.075720 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2a3161e-b16d-436d-b547-87e182ef5e27-logs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.076179 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2a3161e-b16d-436d-b547-87e182ef5e27-logs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.080070 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-config-data\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.083467 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-internal-tls-certs\") pod \"nova-api-0\" 
(UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.083468 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.083520 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.100372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572w4\" (UniqueName: \"kubernetes.io/projected/e2a3161e-b16d-436d-b547-87e182ef5e27-kube-api-access-572w4\") pod \"nova-api-0\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.239677 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.247366 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd0bbd4-8354-4260-9c9d-f8263046f438" path="/var/lib/kubelet/pods/3bd0bbd4-8354-4260-9c9d-f8263046f438/volumes" Oct 13 13:28:27 crc kubenswrapper[4797]: I1013 13:28:27.726255 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.123224 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.297541 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-config-data\") pod \"e9ad8baa-2c07-4598-9347-71831d4d264e\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.297719 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-combined-ca-bundle\") pod \"e9ad8baa-2c07-4598-9347-71831d4d264e\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.297809 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2vq\" (UniqueName: \"kubernetes.io/projected/e9ad8baa-2c07-4598-9347-71831d4d264e-kube-api-access-mt2vq\") pod \"e9ad8baa-2c07-4598-9347-71831d4d264e\" (UID: \"e9ad8baa-2c07-4598-9347-71831d4d264e\") " Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.301557 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ad8baa-2c07-4598-9347-71831d4d264e-kube-api-access-mt2vq" (OuterVolumeSpecName: "kube-api-access-mt2vq") pod "e9ad8baa-2c07-4598-9347-71831d4d264e" (UID: "e9ad8baa-2c07-4598-9347-71831d4d264e"). InnerVolumeSpecName "kube-api-access-mt2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.328622 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-config-data" (OuterVolumeSpecName: "config-data") pod "e9ad8baa-2c07-4598-9347-71831d4d264e" (UID: "e9ad8baa-2c07-4598-9347-71831d4d264e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.339592 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9ad8baa-2c07-4598-9347-71831d4d264e" (UID: "e9ad8baa-2c07-4598-9347-71831d4d264e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.401266 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.401317 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ad8baa-2c07-4598-9347-71831d4d264e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.401330 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2vq\" (UniqueName: \"kubernetes.io/projected/e9ad8baa-2c07-4598-9347-71831d4d264e-kube-api-access-mt2vq\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.520962 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2a3161e-b16d-436d-b547-87e182ef5e27","Type":"ContainerStarted","Data":"96c199433ec042676ed19e22d59f1e2298214f4e17f2643c330e718dd6cd93a6"} Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.521022 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2a3161e-b16d-436d-b547-87e182ef5e27","Type":"ContainerStarted","Data":"5098f65b53c031667082257aa9c75ea641809f197a0ff696aad19b986be56dba"} Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.521040 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2a3161e-b16d-436d-b547-87e182ef5e27","Type":"ContainerStarted","Data":"f7f353052245c703cc603e6eb0933e0bc1e907108176ccd384de9c18ca71a4ea"} Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.522692 4797 generic.go:334] "Generic (PLEG): container finished" podID="e9ad8baa-2c07-4598-9347-71831d4d264e" containerID="e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b" exitCode=0 Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.522724 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9ad8baa-2c07-4598-9347-71831d4d264e","Type":"ContainerDied","Data":"e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b"} Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.522741 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9ad8baa-2c07-4598-9347-71831d4d264e","Type":"ContainerDied","Data":"8429031209a2425ff9179441bf86c64e86759cae28a1c556e15ffb31901e4634"} Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.522741 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.522757 4797 scope.go:117] "RemoveContainer" containerID="e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.538877 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.538855604 podStartE2EDuration="2.538855604s" podCreationTimestamp="2025-10-13 13:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:28.535708897 +0000 UTC m=+1286.069259163" watchObservedRunningTime="2025-10-13 13:28:28.538855604 +0000 UTC m=+1286.072405860" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.545434 4797 scope.go:117] "RemoveContainer" containerID="e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b" Oct 13 13:28:28 crc kubenswrapper[4797]: E1013 13:28:28.545980 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b\": container with ID starting with e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b not found: ID does not exist" containerID="e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.546017 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b"} err="failed to get container status \"e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b\": rpc error: code = NotFound desc = could not find container \"e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b\": container with ID starting with e0bd5a9e4a93dff8c862d42da3d24ddaaf3e00e8cc59c05ccb112cb226cdf57b not 
found: ID does not exist" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.557503 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.573972 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.584826 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:28:28 crc kubenswrapper[4797]: E1013 13:28:28.585374 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ad8baa-2c07-4598-9347-71831d4d264e" containerName="nova-scheduler-scheduler" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.585396 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ad8baa-2c07-4598-9347-71831d4d264e" containerName="nova-scheduler-scheduler" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.585580 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ad8baa-2c07-4598-9347-71831d4d264e" containerName="nova-scheduler-scheduler" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.586262 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.588536 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.594410 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.708767 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-config-data\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.709467 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddln6\" (UniqueName: \"kubernetes.io/projected/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-kube-api-access-ddln6\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.709581 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.811689 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-config-data\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.812669 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ddln6\" (UniqueName: \"kubernetes.io/projected/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-kube-api-access-ddln6\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.812806 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.817155 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-config-data\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.817655 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.842509 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddln6\" (UniqueName: \"kubernetes.io/projected/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-kube-api-access-ddln6\") pod \"nova-scheduler-0\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.906623 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.943166 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:48668->10.217.0.194:8775: read: connection reset by peer" Oct 13 13:28:28 crc kubenswrapper[4797]: I1013 13:28:28.943187 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:48676->10.217.0.194:8775: read: connection reset by peer" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.248850 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ad8baa-2c07-4598-9347-71831d4d264e" path="/var/lib/kubelet/pods/e9ad8baa-2c07-4598-9347-71831d4d264e/volumes" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.397449 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:28:29 crc kubenswrapper[4797]: W1013 13:28:29.406125 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10b6d3a_3a6f_4ab7_9ded_4885e28bdbc6.slice/crio-37f701374f5f727c9573ca188d7d69b2c041a92307e0c7e1eabcd64ec3722489 WatchSource:0}: Error finding container 37f701374f5f727c9573ca188d7d69b2c041a92307e0c7e1eabcd64ec3722489: Status 404 returned error can't find the container with id 37f701374f5f727c9573ca188d7d69b2c041a92307e0c7e1eabcd64ec3722489 Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.436748 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.537945 4797 generic.go:334] "Generic (PLEG): container finished" podID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerID="a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185" exitCode=0 Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.538031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4aac353b-6a2c-4072-b40f-cc91a3907bce","Type":"ContainerDied","Data":"a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185"} Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.538062 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4aac353b-6a2c-4072-b40f-cc91a3907bce","Type":"ContainerDied","Data":"554c6ec6d8d8580a7786e8f9862a6d56e6c5a24f50910c97fb741def07be9cee"} Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.538082 4797 scope.go:117] "RemoveContainer" containerID="a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.538204 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.539942 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6","Type":"ContainerStarted","Data":"37f701374f5f727c9573ca188d7d69b2c041a92307e0c7e1eabcd64ec3722489"} Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.562720 4797 scope.go:117] "RemoveContainer" containerID="1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.572688 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4ccm\" (UniqueName: \"kubernetes.io/projected/4aac353b-6a2c-4072-b40f-cc91a3907bce-kube-api-access-s4ccm\") pod \"4aac353b-6a2c-4072-b40f-cc91a3907bce\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.572740 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-config-data\") pod \"4aac353b-6a2c-4072-b40f-cc91a3907bce\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.572777 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-nova-metadata-tls-certs\") pod \"4aac353b-6a2c-4072-b40f-cc91a3907bce\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.572837 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aac353b-6a2c-4072-b40f-cc91a3907bce-logs\") pod \"4aac353b-6a2c-4072-b40f-cc91a3907bce\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " Oct 13 13:28:29 crc kubenswrapper[4797]: 
I1013 13:28:29.572876 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-combined-ca-bundle\") pod \"4aac353b-6a2c-4072-b40f-cc91a3907bce\" (UID: \"4aac353b-6a2c-4072-b40f-cc91a3907bce\") " Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.573949 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aac353b-6a2c-4072-b40f-cc91a3907bce-logs" (OuterVolumeSpecName: "logs") pod "4aac353b-6a2c-4072-b40f-cc91a3907bce" (UID: "4aac353b-6a2c-4072-b40f-cc91a3907bce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.577327 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aac353b-6a2c-4072-b40f-cc91a3907bce-kube-api-access-s4ccm" (OuterVolumeSpecName: "kube-api-access-s4ccm") pod "4aac353b-6a2c-4072-b40f-cc91a3907bce" (UID: "4aac353b-6a2c-4072-b40f-cc91a3907bce"). InnerVolumeSpecName "kube-api-access-s4ccm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.585289 4797 scope.go:117] "RemoveContainer" containerID="a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185" Oct 13 13:28:29 crc kubenswrapper[4797]: E1013 13:28:29.585629 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185\": container with ID starting with a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185 not found: ID does not exist" containerID="a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.585656 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185"} err="failed to get container status \"a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185\": rpc error: code = NotFound desc = could not find container \"a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185\": container with ID starting with a9336d63ad3480a4cbf4f416b73926c80dd69c8319d9b129aa45be9c596f6185 not found: ID does not exist" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.585675 4797 scope.go:117] "RemoveContainer" containerID="1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6" Oct 13 13:28:29 crc kubenswrapper[4797]: E1013 13:28:29.585867 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6\": container with ID starting with 1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6 not found: ID does not exist" containerID="1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.585883 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6"} err="failed to get container status \"1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6\": rpc error: code = NotFound desc = could not find container \"1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6\": container with ID starting with 1caf1d3c32e8a777d8f507d9d95855d40ffc2c5ead36e51f277d99f3a725c0f6 not found: ID does not exist" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.605182 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-config-data" (OuterVolumeSpecName: "config-data") pod "4aac353b-6a2c-4072-b40f-cc91a3907bce" (UID: "4aac353b-6a2c-4072-b40f-cc91a3907bce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.610725 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4aac353b-6a2c-4072-b40f-cc91a3907bce" (UID: "4aac353b-6a2c-4072-b40f-cc91a3907bce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.629616 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4aac353b-6a2c-4072-b40f-cc91a3907bce" (UID: "4aac353b-6a2c-4072-b40f-cc91a3907bce"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.675921 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4ccm\" (UniqueName: \"kubernetes.io/projected/4aac353b-6a2c-4072-b40f-cc91a3907bce-kube-api-access-s4ccm\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.676366 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.676406 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.676424 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aac353b-6a2c-4072-b40f-cc91a3907bce-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.676438 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aac353b-6a2c-4072-b40f-cc91a3907bce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.876930 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.897828 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.908044 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:28:29 crc kubenswrapper[4797]: E1013 13:28:29.908478 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-metadata" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.908500 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-metadata" Oct 13 13:28:29 crc kubenswrapper[4797]: E1013 13:28:29.908546 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-log" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.908555 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-log" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.908763 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-log" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.908789 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" containerName="nova-metadata-metadata" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.910171 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.912111 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.912271 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 13 13:28:29 crc kubenswrapper[4797]: I1013 13:28:29.917604 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.082680 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.083019 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-logs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.083073 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-config-data\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.083094 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz5rs\" (UniqueName: \"kubernetes.io/projected/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-kube-api-access-wz5rs\") pod \"nova-metadata-0\" 
(UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.083112 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.184499 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.184653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-logs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.184734 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-config-data\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.184760 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz5rs\" (UniqueName: \"kubernetes.io/projected/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-kube-api-access-wz5rs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.184793 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.189841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.189990 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-config-data\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.190679 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-logs\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.197508 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.205328 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz5rs\" (UniqueName: \"kubernetes.io/projected/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-kube-api-access-wz5rs\") pod \"nova-metadata-0\" 
(UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.263593 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.553847 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6","Type":"ContainerStarted","Data":"eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e"} Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.573005 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.572990309 podStartE2EDuration="2.572990309s" podCreationTimestamp="2025-10-13 13:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:30.570077128 +0000 UTC m=+1288.103627384" watchObservedRunningTime="2025-10-13 13:28:30.572990309 +0000 UTC m=+1288.106540565" Oct 13 13:28:30 crc kubenswrapper[4797]: I1013 13:28:30.723098 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:28:31 crc kubenswrapper[4797]: I1013 13:28:31.248364 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aac353b-6a2c-4072-b40f-cc91a3907bce" path="/var/lib/kubelet/pods/4aac353b-6a2c-4072-b40f-cc91a3907bce/volumes" Oct 13 13:28:31 crc kubenswrapper[4797]: I1013 13:28:31.563969 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa","Type":"ContainerStarted","Data":"2ce2d5a74583559ea08ad5502820fbbd8cb181a62c48ed513f6762ab6f0ec152"} Oct 13 13:28:31 crc kubenswrapper[4797]: I1013 13:28:31.564012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa","Type":"ContainerStarted","Data":"87d1861a88b2c106ab6c47eaad4f1a0647557f81e2c4766a50ad9d24c12524bf"} Oct 13 13:28:31 crc kubenswrapper[4797]: I1013 13:28:31.564023 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa","Type":"ContainerStarted","Data":"c45f2506b30ee16013f3be2a9e77460b2bb6cc5f56459c665a0b2070ea0e1c62"} Oct 13 13:28:31 crc kubenswrapper[4797]: I1013 13:28:31.593555 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5935279270000002 podStartE2EDuration="2.593527927s" podCreationTimestamp="2025-10-13 13:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 13:28:31.583195254 +0000 UTC m=+1289.116745520" watchObservedRunningTime="2025-10-13 13:28:31.593527927 +0000 UTC m=+1289.127078183" Oct 13 13:28:33 crc kubenswrapper[4797]: I1013 13:28:33.906784 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 13:28:35 crc kubenswrapper[4797]: I1013 13:28:35.264099 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 13:28:35 crc kubenswrapper[4797]: I1013 13:28:35.264456 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 13:28:37 crc kubenswrapper[4797]: I1013 13:28:37.246021 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 13:28:37 crc kubenswrapper[4797]: I1013 13:28:37.246936 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 13:28:38 crc kubenswrapper[4797]: I1013 13:28:38.254108 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 13:28:38 crc kubenswrapper[4797]: I1013 13:28:38.254146 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 13:28:38 crc kubenswrapper[4797]: I1013 13:28:38.906964 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 13:28:38 crc kubenswrapper[4797]: I1013 13:28:38.945842 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 13:28:39 crc kubenswrapper[4797]: I1013 13:28:39.680716 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 13:28:40 crc kubenswrapper[4797]: I1013 13:28:40.264299 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 13:28:40 crc kubenswrapper[4797]: I1013 13:28:40.264352 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 13:28:41 crc kubenswrapper[4797]: I1013 13:28:41.278303 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 13:28:41 crc kubenswrapper[4797]: I1013 13:28:41.278356 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 13:28:43 crc kubenswrapper[4797]: I1013 13:28:43.619285 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 13 13:28:47 crc kubenswrapper[4797]: I1013 13:28:47.251882 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 13:28:47 crc kubenswrapper[4797]: I1013 13:28:47.252221 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 13:28:47 crc kubenswrapper[4797]: I1013 13:28:47.252874 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 13:28:47 crc kubenswrapper[4797]: I1013 13:28:47.253402 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 13:28:47 crc kubenswrapper[4797]: I1013 13:28:47.259468 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 13:28:47 crc kubenswrapper[4797]: I1013 13:28:47.259830 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 13:28:48 crc kubenswrapper[4797]: I1013 13:28:48.120387 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:28:48 crc kubenswrapper[4797]: I1013 13:28:48.120740 4797 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.725864 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.746776 4797 generic.go:334] "Generic (PLEG): container finished" podID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerID="61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c" exitCode=137 Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.746869 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerDied","Data":"61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c"} Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.746974 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.747007 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c12a30d-f475-4e40-ac5f-0c03c238fb89","Type":"ContainerDied","Data":"5b9ea7bdd9f09d26c717c7300432d1d81962b0ae20127762cc269306b273354e"} Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.747040 4797 scope.go:117] "RemoveContainer" containerID="61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.776823 4797 scope.go:117] "RemoveContainer" containerID="3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.801846 4797 scope.go:117] "RemoveContainer" containerID="d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.836163 4797 scope.go:117] "RemoveContainer" containerID="7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.846580 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-sg-core-conf-yaml\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.846723 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtz26\" (UniqueName: \"kubernetes.io/projected/8c12a30d-f475-4e40-ac5f-0c03c238fb89-kube-api-access-vtz26\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.846778 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-ceilometer-tls-certs\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.846905 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-run-httpd\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.847046 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-scripts\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.847081 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-combined-ca-bundle\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.847107 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-config-data\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.847181 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-log-httpd\") pod \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\" (UID: \"8c12a30d-f475-4e40-ac5f-0c03c238fb89\") " Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.847679 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.847868 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.852836 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c12a30d-f475-4e40-ac5f-0c03c238fb89-kube-api-access-vtz26" (OuterVolumeSpecName: "kube-api-access-vtz26") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "kube-api-access-vtz26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.854078 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-scripts" (OuterVolumeSpecName: "scripts") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.859831 4797 scope.go:117] "RemoveContainer" containerID="61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c" Oct 13 13:28:49 crc kubenswrapper[4797]: E1013 13:28:49.860405 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c\": container with ID starting with 61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c not found: ID does not exist" containerID="61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.860456 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c"} err="failed to get container status \"61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c\": rpc error: code = NotFound desc = could not find container \"61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c\": container with ID starting with 61b3075099c1eee3aa921f23de5f19ac78ed9c6b5c838c6e0eb79c1c1ecd3b9c not found: ID does not exist" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.860485 4797 scope.go:117] "RemoveContainer" containerID="3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8" Oct 13 13:28:49 crc kubenswrapper[4797]: E1013 13:28:49.861031 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8\": container with ID starting with 3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8 not found: ID does not exist" containerID="3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.861094 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8"} err="failed to get container status \"3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8\": rpc error: code = NotFound desc = could not find container \"3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8\": container with ID starting with 3cf9fcc54220118a191ddf671d8241baf56272d228d5553f00738e1a85482ca8 not found: ID does not exist" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.861130 4797 scope.go:117] "RemoveContainer" containerID="d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03" Oct 13 13:28:49 crc kubenswrapper[4797]: E1013 13:28:49.861520 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03\": container with ID starting with d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03 not found: ID does not exist" containerID="d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.861659 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03"} err="failed to get container status \"d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03\": rpc error: code = NotFound desc = could not find container \"d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03\": container with ID starting with d8864921b8dbbdee8b7c5297ccc18f6112af6fa0d1b9e8f119c7cd8f8c905c03 not found: ID does not exist" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.861773 4797 scope.go:117] "RemoveContainer" containerID="7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03" Oct 13 13:28:49 crc kubenswrapper[4797]: E1013 
13:28:49.862307 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03\": container with ID starting with 7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03 not found: ID does not exist" containerID="7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.862356 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03"} err="failed to get container status \"7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03\": rpc error: code = NotFound desc = could not find container \"7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03\": container with ID starting with 7f94011a6860107ef9dae63f1f21baacda439283afab4d633ec8e51fc6caaa03 not found: ID does not exist" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.878042 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.930378 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.941730 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.945101 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-config-data" (OuterVolumeSpecName: "config-data") pod "8c12a30d-f475-4e40-ac5f-0c03c238fb89" (UID: "8c12a30d-f475-4e40-ac5f-0c03c238fb89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.948898 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949013 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949087 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949150 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 
13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949206 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949267 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c12a30d-f475-4e40-ac5f-0c03c238fb89-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949327 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c12a30d-f475-4e40-ac5f-0c03c238fb89-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:49 crc kubenswrapper[4797]: I1013 13:28:49.949382 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtz26\" (UniqueName: \"kubernetes.io/projected/8c12a30d-f475-4e40-ac5f-0c03c238fb89-kube-api-access-vtz26\") on node \"crc\" DevicePath \"\"" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.087955 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.104767 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.116305 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:28:50 crc kubenswrapper[4797]: E1013 13:28:50.116849 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-notification-agent" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.116875 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-notification-agent" Oct 13 13:28:50 crc kubenswrapper[4797]: E1013 13:28:50.116905 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-central-agent" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.116914 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-central-agent" Oct 13 13:28:50 crc kubenswrapper[4797]: E1013 13:28:50.116939 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="sg-core" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.116947 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="sg-core" Oct 13 13:28:50 crc kubenswrapper[4797]: E1013 13:28:50.116963 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="proxy-httpd" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.116970 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="proxy-httpd" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.117177 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-central-agent" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.117192 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="sg-core" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.117207 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="ceilometer-notification-agent" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.117229 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" containerName="proxy-httpd" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 
13:28:50.119309 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.127703 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.165240 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.165492 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.165678 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.265746 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-scripts\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.265846 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.265882 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-run-httpd\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.266011 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.266055 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-config-data\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.266113 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-log-httpd\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.266148 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.266176 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fqs5\" (UniqueName: \"kubernetes.io/projected/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-kube-api-access-4fqs5\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.269242 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 13:28:50 crc kubenswrapper[4797]: 
I1013 13:28:50.269722 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.274136 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.367587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.367664 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-config-data\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.367749 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-log-httpd\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.367785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.367837 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fqs5\" (UniqueName: \"kubernetes.io/projected/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-kube-api-access-4fqs5\") pod 
\"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.367989 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-scripts\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.368015 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.368038 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-run-httpd\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.368289 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-log-httpd\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.368985 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-run-httpd\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.373263 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.373410 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.374046 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-scripts\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.374287 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.384615 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-config-data\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.394422 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fqs5\" (UniqueName: \"kubernetes.io/projected/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-kube-api-access-4fqs5\") pod \"ceilometer-0\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " pod="openstack/ceilometer-0" Oct 13 13:28:50 crc kubenswrapper[4797]: I1013 13:28:50.482865 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:28:51 crc kubenswrapper[4797]: I1013 13:28:50.762749 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 13:28:51 crc kubenswrapper[4797]: I1013 13:28:50.941687 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:28:51 crc kubenswrapper[4797]: W1013 13:28:50.942851 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7297d3f3_134d_4fcf_85f2_8b414e2fb27d.slice/crio-2cab4f7f0a37ffa4f3249cf612ce70a140a5ccf56e26a4bb1043dead686d84eb WatchSource:0}: Error finding container 2cab4f7f0a37ffa4f3249cf612ce70a140a5ccf56e26a4bb1043dead686d84eb: Status 404 returned error can't find the container with id 2cab4f7f0a37ffa4f3249cf612ce70a140a5ccf56e26a4bb1043dead686d84eb Oct 13 13:28:51 crc kubenswrapper[4797]: I1013 13:28:51.260646 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c12a30d-f475-4e40-ac5f-0c03c238fb89" path="/var/lib/kubelet/pods/8c12a30d-f475-4e40-ac5f-0c03c238fb89/volumes" Oct 13 13:28:51 crc kubenswrapper[4797]: I1013 13:28:51.767157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerStarted","Data":"2cab4f7f0a37ffa4f3249cf612ce70a140a5ccf56e26a4bb1043dead686d84eb"} Oct 13 13:28:52 crc kubenswrapper[4797]: I1013 13:28:52.779059 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerStarted","Data":"3e128c54e1e732791a3db021ae19e7a7d4bd2ecbe1361a7e7ba33071c47c83c3"} Oct 13 13:28:52 crc kubenswrapper[4797]: I1013 13:28:52.779163 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerStarted","Data":"8b3dbc305e498b09f402b2934e96ebeacc6ba59d0b362bafb37d2783cec22f3c"} Oct 13 13:28:53 crc kubenswrapper[4797]: I1013 13:28:53.797917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerStarted","Data":"0eb68056284c660ad9015956fa330c09c86d1bd6d0b2e7c87865c45cb9f710fa"} Oct 13 13:28:55 crc kubenswrapper[4797]: I1013 13:28:55.818966 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerStarted","Data":"26a477d9a49a32396a4c8959aafd79c16854ffb6419bb4f4a5f1de667a6d6c19"} Oct 13 13:28:55 crc kubenswrapper[4797]: I1013 13:28:55.819612 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 13:28:55 crc kubenswrapper[4797]: I1013 13:28:55.844848 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.962959538 podStartE2EDuration="5.844826448s" podCreationTimestamp="2025-10-13 13:28:50 +0000 UTC" firstStartedPulling="2025-10-13 13:28:50.945331302 +0000 UTC m=+1308.478881558" lastFinishedPulling="2025-10-13 13:28:54.827198212 +0000 UTC m=+1312.360748468" observedRunningTime="2025-10-13 13:28:55.838455502 +0000 UTC m=+1313.372005768" watchObservedRunningTime="2025-10-13 13:28:55.844826448 +0000 UTC m=+1313.378376704" Oct 13 13:29:18 crc kubenswrapper[4797]: I1013 13:29:18.119699 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:29:18 crc kubenswrapper[4797]: I1013 13:29:18.120236 4797 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:29:20 crc kubenswrapper[4797]: I1013 13:29:20.502736 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 13:29:42 crc kubenswrapper[4797]: I1013 13:29:42.797109 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 13 13:29:42 crc kubenswrapper[4797]: I1013 13:29:42.797879 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" containerName="openstackclient" containerID="cri-o://a4c476b6ff3b37f629fd62076e094c447b4d35d68c78802f6d598f476f60adb0" gracePeriod=2 Oct 13 13:29:42 crc kubenswrapper[4797]: I1013 13:29:42.822845 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 13 13:29:42 crc kubenswrapper[4797]: I1013 13:29:42.919689 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:29:42 crc kubenswrapper[4797]: E1013 13:29:42.978984 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 13 13:29:42 crc kubenswrapper[4797]: E1013 13:29:42.979063 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data podName:21067728-d3cf-4ff2-94c9-87600f7324ab nodeName:}" failed. No retries permitted until 2025-10-13 13:29:43.479034621 +0000 UTC m=+1361.012584877 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data") pod "rabbitmq-server-0" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab") : configmap "rabbitmq-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.076664 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance496f-account-delete-qst6j"] Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.077105 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" containerName="openstackclient" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.077122 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" containerName="openstackclient" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.077344 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" containerName="openstackclient" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.078022 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.110243 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.111126 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="openstack-network-exporter" containerID="cri-o://ea620bd698810f04fad3cc655e6b00829d4603731d7d1dd124d75e6de787a1e3" gracePeriod=300 Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.141867 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderfaff-account-delete-ltnnq"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.143101 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.183414 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7mf8\" (UniqueName: \"kubernetes.io/projected/312a660f-ea89-49ac-8857-16dae844353f-kube-api-access-x7mf8\") pod \"glance496f-account-delete-qst6j\" (UID: \"312a660f-ea89-49ac-8857-16dae844353f\") " pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.183619 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxxwm\" (UniqueName: \"kubernetes.io/projected/5421ab8e-2db8-4909-b67b-e0491f7b80e7-kube-api-access-jxxwm\") pod \"cinderfaff-account-delete-ltnnq\" (UID: \"5421ab8e-2db8-4909-b67b-e0491f7b80e7\") " pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.192189 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance496f-account-delete-qst6j"] Oct 13 13:29:43 crc 
kubenswrapper[4797]: I1013 13:29:43.259975 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="ovsdbserver-nb" containerID="cri-o://15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b" gracePeriod=300 Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.287613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxxwm\" (UniqueName: \"kubernetes.io/projected/5421ab8e-2db8-4909-b67b-e0491f7b80e7-kube-api-access-jxxwm\") pod \"cinderfaff-account-delete-ltnnq\" (UID: \"5421ab8e-2db8-4909-b67b-e0491f7b80e7\") " pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.287898 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7mf8\" (UniqueName: \"kubernetes.io/projected/312a660f-ea89-49ac-8857-16dae844353f-kube-api-access-x7mf8\") pod \"glance496f-account-delete-qst6j\" (UID: \"312a660f-ea89-49ac-8857-16dae844353f\") " pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.292998 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderfaff-account-delete-ltnnq"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.308317 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.320901 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell084cf-account-delete-r9bpk"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.322339 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.334038 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.334376 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="openstack-network-exporter" containerID="cri-o://eb49a8ba0c15790a316319a7af2bb9f90a15b3fcad1167d493975eeda527d705" gracePeriod=300 Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.334430 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxxwm\" (UniqueName: \"kubernetes.io/projected/5421ab8e-2db8-4909-b67b-e0491f7b80e7-kube-api-access-jxxwm\") pod \"cinderfaff-account-delete-ltnnq\" (UID: \"5421ab8e-2db8-4909-b67b-e0491f7b80e7\") " pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.365290 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b is running failed: container process not found" containerID="15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.369358 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b is running failed: container process not found" containerID="15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.372957 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x7mf8\" (UniqueName: \"kubernetes.io/projected/312a660f-ea89-49ac-8857-16dae844353f-kube-api-access-x7mf8\") pod \"glance496f-account-delete-qst6j\" (UID: \"312a660f-ea89-49ac-8857-16dae844353f\") " pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.375864 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell084cf-account-delete-r9bpk"] Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.386134 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b is running failed: container process not found" containerID="15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.386207 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="ovsdbserver-nb" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.386368 4797 generic.go:334] "Generic (PLEG): container finished" podID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerID="ea620bd698810f04fad3cc655e6b00829d4603731d7d1dd124d75e6de787a1e3" exitCode=2 Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.386405 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"244e58a1-ed2c-4ff6-8885-ebd066e8adab","Type":"ContainerDied","Data":"ea620bd698810f04fad3cc655e6b00829d4603731d7d1dd124d75e6de787a1e3"} Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.391564 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kpc\" (UniqueName: \"kubernetes.io/projected/a327836f-196f-4d29-8792-33085b552aa9-kube-api-access-r5kpc\") pod \"novacell084cf-account-delete-r9bpk\" (UID: \"a327836f-196f-4d29-8792-33085b552aa9\") " pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.393197 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.393234 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data podName:acdec9fc-360a-46e4-89ea-3fde84f417c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:43.893221221 +0000 UTC m=+1361.426771477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data") pod "rabbitmq-cell1-server-0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0") : configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.400838 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapia754-account-delete-6khh6"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.402316 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.404512 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.432672 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapia754-account-delete-6khh6"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.456848 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell122bc-account-delete-v4zs5"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.458143 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.471897 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell122bc-account-delete-v4zs5"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.490249 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.493846 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kpc\" (UniqueName: \"kubernetes.io/projected/a327836f-196f-4d29-8792-33085b552aa9-kube-api-access-r5kpc\") pod \"novacell084cf-account-delete-r9bpk\" (UID: \"a327836f-196f-4d29-8792-33085b552aa9\") " pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.493986 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69xt\" (UniqueName: \"kubernetes.io/projected/3cb03045-bfdc-4d0b-af8f-4e3c4717e792-kube-api-access-q69xt\") pod \"novaapia754-account-delete-6khh6\" (UID: \"3cb03045-bfdc-4d0b-af8f-4e3c4717e792\") " pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.494031 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tw5\" 
(UniqueName: \"kubernetes.io/projected/3111854e-cfce-493d-a094-63479ed35583-kube-api-access-95tw5\") pod \"novacell122bc-account-delete-v4zs5\" (UID: \"3111854e-cfce-493d-a094-63479ed35583\") " pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.494226 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.494280 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data podName:21067728-d3cf-4ff2-94c9-87600f7324ab nodeName:}" failed. No retries permitted until 2025-10-13 13:29:44.494262892 +0000 UTC m=+1362.027813148 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data") pod "rabbitmq-server-0" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab") : configmap "rabbitmq-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.545681 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kpc\" (UniqueName: \"kubernetes.io/projected/a327836f-196f-4d29-8792-33085b552aa9-kube-api-access-r5kpc\") pod \"novacell084cf-account-delete-r9bpk\" (UID: \"a327836f-196f-4d29-8792-33085b552aa9\") " pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.580265 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementcde4-account-delete-dpcwh"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.581549 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.598029 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q69xt\" (UniqueName: \"kubernetes.io/projected/3cb03045-bfdc-4d0b-af8f-4e3c4717e792-kube-api-access-q69xt\") pod \"novaapia754-account-delete-6khh6\" (UID: \"3cb03045-bfdc-4d0b-af8f-4e3c4717e792\") " pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.598090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tw5\" (UniqueName: \"kubernetes.io/projected/3111854e-cfce-493d-a094-63479ed35583-kube-api-access-95tw5\") pod \"novacell122bc-account-delete-v4zs5\" (UID: \"3111854e-cfce-493d-a094-63479ed35583\") " pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.617072 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementcde4-account-delete-dpcwh"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.630091 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jrlnp"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.632510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tw5\" (UniqueName: \"kubernetes.io/projected/3111854e-cfce-493d-a094-63479ed35583-kube-api-access-95tw5\") pod \"novacell122bc-account-delete-v4zs5\" (UID: \"3111854e-cfce-493d-a094-63479ed35583\") " pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.639415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69xt\" (UniqueName: \"kubernetes.io/projected/3cb03045-bfdc-4d0b-af8f-4e3c4717e792-kube-api-access-q69xt\") pod \"novaapia754-account-delete-6khh6\" (UID: \"3cb03045-bfdc-4d0b-af8f-4e3c4717e792\") " 
pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.650616 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jrlnp"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.676837 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xzbbx"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.687226 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.699669 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mnz\" (UniqueName: \"kubernetes.io/projected/ecdce3f4-f1b1-4323-b237-eb28b936ebc7-kube-api-access-z4mnz\") pod \"placementcde4-account-delete-dpcwh\" (UID: \"ecdce3f4-f1b1-4323-b237-eb28b936ebc7\") " pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.742973 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.766269 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.771728 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xzbbx"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.800934 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mnz\" (UniqueName: \"kubernetes.io/projected/ecdce3f4-f1b1-4323-b237-eb28b936ebc7-kube-api-access-z4mnz\") pod \"placementcde4-account-delete-dpcwh\" (UID: \"ecdce3f4-f1b1-4323-b237-eb28b936ebc7\") " pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.818013 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="ovsdbserver-sb" containerID="cri-o://2f8d3e11442030783b8beb821adc64c961a2829fa98e12946ceb2502de4e83c9" gracePeriod=300 Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.863971 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mnz\" (UniqueName: \"kubernetes.io/projected/ecdce3f4-f1b1-4323-b237-eb28b936ebc7-kube-api-access-z4mnz\") pod \"placementcde4-account-delete-dpcwh\" (UID: \"ecdce3f4-f1b1-4323-b237-eb28b936ebc7\") " pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.896971 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.897232 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="ovn-northd" containerID="cri-o://c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" gracePeriod=30 Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.897349 4797 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="openstack-network-exporter" containerID="cri-o://1a733e45e064aeec3878d4c5dd8fe67bcb4e25caaf9192479e03c78ce5fbd2b5" gracePeriod=30 Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.903196 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: E1013 13:29:43.903247 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data podName:acdec9fc-360a-46e4-89ea-3fde84f417c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:44.903231305 +0000 UTC m=+1362.436781561 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data") pod "rabbitmq-cell1-server-0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0") : configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.915534 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-z9wd5"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.961411 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z9wd5"] Oct 13 13:29:43 crc kubenswrapper[4797]: I1013 13:29:43.989068 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pp9mc"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.017631 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pp9mc"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.085374 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-5dzch"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.108867 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-5dzch"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.141230 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.155264 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-2mpq9"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.190463 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-d764t"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.191881 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-d764t"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.210868 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-htk8n"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.225240 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-57bdg"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.225615 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-57bdg" podUID="1fc3b8cc-c74c-402d-8284-7d578bfa7c02" containerName="openstack-network-exporter" containerID="cri-o://58cfde229cd18c95dec2460726a8982f10895baa52b242e5b8b923162d984a0a" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.249821 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tqpk9"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.269649 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tqpk9"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.282999 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bf758599-wncdc"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.283198 4797 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65bf758599-wncdc" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="dnsmasq-dns" containerID="cri-o://f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3" gracePeriod=10 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.298682 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-pn9q8"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.308977 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-pn9q8"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.394146 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.394442 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerName="cinder-scheduler" containerID="cri-o://cf81e79b12c28741c928f832632179cdb954c2366581d8d7306d9be83dbc6228" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.394589 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerName="probe" containerID="cri-o://2cf43831975875d820d476539ef4c0943fa120e06b5317c6aec843e932949edd" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.432394 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.433248 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-server" containerID="cri-o://c4ce4b63b22c727785eb5abec889eca7d94fdcd5d0b3cda4520df492e1acde65" gracePeriod=30 Oct 13 13:29:44 crc 
kubenswrapper[4797]: I1013 13:29:44.435623 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="swift-recon-cron" containerID="cri-o://df5690bc37dc98f263a51145643771f0253fa3da622e37dab0ac9bd217f8b17d" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435709 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="rsync" containerID="cri-o://7c966ea8b8d377fe98198c55458614115fcf39c47512b6e6a01c20502d489780" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435756 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-expirer" containerID="cri-o://0771ac639aaa1011650222d8f818316f8651f5729cf3d02726d6f291d0ca0403" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435802 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-updater" containerID="cri-o://88fe19d7acc00d4d082fc5672f55a59aa59797313060f49a5e87d490f16e6bd6" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435860 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-auditor" containerID="cri-o://95610d8f86f925fa29c39c4b649aca588a5bfca6031effb0eddf4c0ba26766bc" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435900 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-replicator" 
containerID="cri-o://2c54fef5660226ff362a71ac3e917e001db5e9a347a08deab58b5660776811fc" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435941 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-server" containerID="cri-o://f98ce95dd4a03f4ac462cac39a646ae12a813083511d7a3c50bdddd1624ba0aa" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.435979 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-updater" containerID="cri-o://7fa1e18bf4f048760b982cf61e4fdc5706474901c9f59b877086707fd0c2bad5" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.436017 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-auditor" containerID="cri-o://a05561b229ddf692864eb50f4e3134b3ae7ce64dcd992877ca1a87198bca77ca" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.436057 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-replicator" containerID="cri-o://826bb87dfbab6e082d77678c431c5f60fe685f53d7f5d122666545d7e4d18b90" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.436097 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-server" containerID="cri-o://e34ee38f199fac8191d033ce294a7a0959606e96937bb15c3973792f94df9fb9" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.436137 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-reaper" containerID="cri-o://e14c6e172f563574bf54ee4a48fdf4ed5d505dd4e2e640e88c9ca54cdfa00ed3" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.436195 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-auditor" containerID="cri-o://fd4ad2d986ae8532b8471784274e34f2e4066c64a28827eb0bdc5088d9c323f3" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.436233 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-replicator" containerID="cri-o://ad8ecb3fe40030b9df7dc7b9b77499e35e2b251475733a87367b79e69bc6d068" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.519299 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047404d9-b0ab-44e2-a31d-94d8fe429698/ovsdbserver-sb/0.log" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.519343 4797 generic.go:334] "Generic (PLEG): container finished" podID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerID="eb49a8ba0c15790a316319a7af2bb9f90a15b3fcad1167d493975eeda527d705" exitCode=2 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.519361 4797 generic.go:334] "Generic (PLEG): container finished" podID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerID="2f8d3e11442030783b8beb821adc64c961a2829fa98e12946ceb2502de4e83c9" exitCode=143 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.519410 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047404d9-b0ab-44e2-a31d-94d8fe429698","Type":"ContainerDied","Data":"eb49a8ba0c15790a316319a7af2bb9f90a15b3fcad1167d493975eeda527d705"} Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 
13:29:44.519440 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047404d9-b0ab-44e2-a31d-94d8fe429698","Type":"ContainerDied","Data":"2f8d3e11442030783b8beb821adc64c961a2829fa98e12946ceb2502de4e83c9"} Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.535424 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.535733 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-log" containerID="cri-o://5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.536068 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-httpd" containerID="cri-o://14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.549552 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.549778 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-log" containerID="cri-o://52fe991f1e2fd4f29fc6e6714cf185447bb07fb1c6293b5cab3549dd2ea20248" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.550174 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-httpd" 
containerID="cri-o://c575ab6cf83d919cac32e185c4f667a6b3abc5c3952e0de54dbbac6e3ad28900" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.569779 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_244e58a1-ed2c-4ff6-8885-ebd066e8adab/ovsdbserver-nb/0.log" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.569834 4797 generic.go:334] "Generic (PLEG): container finished" podID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerID="15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b" exitCode=143 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.569887 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"244e58a1-ed2c-4ff6-8885-ebd066e8adab","Type":"ContainerDied","Data":"15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b"} Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.575953 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.576242 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api-log" containerID="cri-o://b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.576639 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api" containerID="cri-o://bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: E1013 13:29:44.586639 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 13 13:29:44 crc kubenswrapper[4797]: E1013 13:29:44.586705 4797 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data podName:21067728-d3cf-4ff2-94c9-87600f7324ab nodeName:}" failed. No retries permitted until 2025-10-13 13:29:46.586689421 +0000 UTC m=+1364.120239677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data") pod "rabbitmq-server-0" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab") : configmap "rabbitmq-config-data" not found Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.598073 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-57bdg_1fc3b8cc-c74c-402d-8284-7d578bfa7c02/openstack-network-exporter/0.log" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.598130 4797 generic.go:334] "Generic (PLEG): container finished" podID="1fc3b8cc-c74c-402d-8284-7d578bfa7c02" containerID="58cfde229cd18c95dec2460726a8982f10895baa52b242e5b8b923162d984a0a" exitCode=2 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.598256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-57bdg" event={"ID":"1fc3b8cc-c74c-402d-8284-7d578bfa7c02","Type":"ContainerDied","Data":"58cfde229cd18c95dec2460726a8982f10895baa52b242e5b8b923162d984a0a"} Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.625274 4797 generic.go:334] "Generic (PLEG): container finished" podID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerID="1a733e45e064aeec3878d4c5dd8fe67bcb4e25caaf9192479e03c78ce5fbd2b5" exitCode=2 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.625314 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d661f302-5234-4d18-9aa8-0eddd26153fe","Type":"ContainerDied","Data":"1a733e45e064aeec3878d4c5dd8fe67bcb4e25caaf9192479e03c78ce5fbd2b5"} Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.667485 4797 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-65bf758599-wncdc" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: connect: connection refused" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.680657 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.680916 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-log" containerID="cri-o://87d1861a88b2c106ab6c47eaad4f1a0647557f81e2c4766a50ad9d24c12524bf" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.681314 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-metadata" containerID="cri-o://2ce2d5a74583559ea08ad5502820fbbd8cb181a62c48ed513f6762ab6f0ec152" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.700896 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.729491 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-778fd9d9d-t868n"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.729824 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-778fd9d9d-t868n" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-log" containerID="cri-o://f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.730344 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-778fd9d9d-t868n" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" 
containerName="placement-api" containerID="cri-o://15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.745854 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_244e58a1-ed2c-4ff6-8885-ebd066e8adab/ovsdbserver-nb/0.log" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.745935 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.750256 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.750482 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-log" containerID="cri-o://5098f65b53c031667082257aa9c75ea641809f197a0ff696aad19b986be56dba" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.750683 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-api" containerID="cri-o://96c199433ec042676ed19e22d59f1e2298214f4e17f2643c330e718dd6cd93a6" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.812649 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57cbbb4d89-r9rvd"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.814424 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57cbbb4d89-r9rvd" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-api" containerID="cri-o://765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.815014 4797 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/neutron-57cbbb4d89-r9rvd" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-httpd" containerID="cri-o://366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.819585 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerName="rabbitmq" containerID="cri-o://b14ae8c7513d0ced2b738189c2e015ff0743dc1feeeda16b9f6380925730cb4f" gracePeriod=604800 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.822174 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.886738 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896567 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-config\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896668 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-combined-ca-bundle\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896713 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdb-rundir\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 
13:29:44.896733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdbserver-nb-tls-certs\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896826 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-metrics-certs-tls-certs\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896844 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktv47\" (UniqueName: \"kubernetes.io/projected/244e58a1-ed2c-4ff6-8885-ebd066e8adab-kube-api-access-ktv47\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896872 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-scripts\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.896955 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\" (UID: \"244e58a1-ed2c-4ff6-8885-ebd066e8adab\") " Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.903733 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod 
"244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.904290 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-config" (OuterVolumeSpecName: "config") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.914677 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.915153 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-scripts" (OuterVolumeSpecName: "scripts") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.953969 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244e58a1-ed2c-4ff6-8885-ebd066e8adab-kube-api-access-ktv47" (OuterVolumeSpecName: "kube-api-access-ktv47") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "kube-api-access-ktv47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.963944 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c8957dfc-xhpz5"] Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.964408 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c8957dfc-xhpz5" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-httpd" containerID="cri-o://6d39515a47c02aadeb66a5f42bd4f250497bcfc5b08bfe8c08ae7fa7aae9544b" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.964716 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c8957dfc-xhpz5" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-server" containerID="cri-o://b0e1792b043ca301b8bfdc8d457a0e08d3ef2be1af52069dc3da398ff15a9eb7" gracePeriod=30 Oct 13 13:29:44 crc kubenswrapper[4797]: I1013 13:29:44.993318 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-84cf-account-create-hdct5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.014882 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktv47\" (UniqueName: \"kubernetes.io/projected/244e58a1-ed2c-4ff6-8885-ebd066e8adab-kube-api-access-ktv47\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.014910 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.014931 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.014942 4797 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244e58a1-ed2c-4ff6-8885-ebd066e8adab-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.014950 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.015184 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.016560 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data podName:acdec9fc-360a-46e4-89ea-3fde84f417c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:47.016540773 +0000 UTC m=+1364.550091029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data") pod "rabbitmq-cell1-server-0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0") : configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.057963 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-84cf-account-create-hdct5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.063012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.086862 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9kldr"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.104544 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9kldr"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.122119 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.128641 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell084cf-account-delete-r9bpk"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.170963 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f077-account-create-c5h7s"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.192716 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vchss"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.203582 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.203648 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f077-account-create-c5h7s"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.221454 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerName="rabbitmq" containerID="cri-o://84b296cdae027daaba2dce536affe2df5bb8565c9eaea497ef3762320f6ea09d" gracePeriod=604800 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.227465 4797 reconciler_common.go:293] 
"Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.270151 4797 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican-worker-cc74bd777-hvb5p" secret="" err="secret \"barbican-barbican-dockercfg-cb64g\" not found" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.316511 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb" path="/var/lib/kubelet/pods/50e6d20e-a3e6-43a4-a10c-2d7d06d1cfeb/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.317219 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646c1def-f060-445e-b0a1-c616965aca86" path="/var/lib/kubelet/pods/646c1def-f060-445e-b0a1-c616965aca86/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.322997 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c2e451-cf73-4e5e-9e2e-703043c09184" path="/var/lib/kubelet/pods/87c2e451-cf73-4e5e-9e2e-703043c09184/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.324210 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c56c463-d9bb-46df-995e-424abef44adc" path="/var/lib/kubelet/pods/8c56c463-d9bb-46df-995e-424abef44adc/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.332428 4797 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.332494 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:45.83247862 +0000 UTC m=+1363.366028876 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.334645 4797 secret.go:188] Couldn't get secret openstack/barbican-worker-config-data: secret "barbican-worker-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.334682 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:45.834673674 +0000 UTC m=+1363.368223930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-worker-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.335773 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9075d4a4-54e2-492f-bc84-bd1fb11df325" path="/var/lib/kubelet/pods/9075d4a4-54e2-492f-bc84-bd1fb11df325/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.346138 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974e0c08-3519-4be7-a9d1-c7db6016ad6f" path="/var/lib/kubelet/pods/974e0c08-3519-4be7-a9d1-c7db6016ad6f/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.348760 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b" path="/var/lib/kubelet/pods/a8ac30bd-fd2e-4d0e-99b6-06b9adc7981b/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.349539 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bb8faf00-58df-4131-af70-117df286f396" path="/var/lib/kubelet/pods/bb8faf00-58df-4131-af70-117df286f396/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.352310 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5dd5a95-b091-4f4f-8b05-d49d9dc0e979" path="/var/lib/kubelet/pods/c5dd5a95-b091-4f4f-8b05-d49d9dc0e979/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.354013 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7978a82-8a2f-4a86-8598-65e7dae25b77" path="/var/lib/kubelet/pods/d7978a82-8a2f-4a86-8598-65e7dae25b77/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.360002 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6a4b4d-5231-4b06-9fa1-695aee17a37b" path="/var/lib/kubelet/pods/ef6a4b4d-5231-4b06-9fa1-695aee17a37b/volumes" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.363563 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.368150 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "244e58a1-ed2c-4ff6-8885-ebd066e8adab" (UID: "244e58a1-ed2c-4ff6-8885-ebd066e8adab"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.389208 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vchss"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.389244 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vmbzt"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.389257 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vmbzt"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.420423 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a754-account-create-545z2"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.444257 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.444589 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/244e58a1-ed2c-4ff6-8885-ebd066e8adab-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.447399 4797 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 13 13:29:45 crc kubenswrapper[4797]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 13 13:29:45 crc kubenswrapper[4797]: + source /usr/local/bin/container-scripts/functions Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNBridge=br-int Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNRemote=tcp:localhost:6642 Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNEncapType=geneve Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNAvailabilityZones= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ 
EnableChassisAsGateway=true Oct 13 13:29:45 crc kubenswrapper[4797]: ++ PhysicalNetworks= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNHostName= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 13 13:29:45 crc kubenswrapper[4797]: ++ ovs_dir=/var/lib/openvswitch Oct 13 13:29:45 crc kubenswrapper[4797]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 13 13:29:45 crc kubenswrapper[4797]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 13 13:29:45 crc kubenswrapper[4797]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + sleep 0.5 Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + sleep 0.5 Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + cleanup_ovsdb_server_semaphore Oct 13 13:29:45 crc kubenswrapper[4797]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 13 13:29:45 crc kubenswrapper[4797]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 13 13:29:45 crc kubenswrapper[4797]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-2mpq9" message=< Oct 13 13:29:45 crc kubenswrapper[4797]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 13 13:29:45 crc kubenswrapper[4797]: + source /usr/local/bin/container-scripts/functions Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNBridge=br-int Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNRemote=tcp:localhost:6642 Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNEncapType=geneve Oct 13 13:29:45 crc kubenswrapper[4797]: ++ 
OVNAvailabilityZones= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ EnableChassisAsGateway=true Oct 13 13:29:45 crc kubenswrapper[4797]: ++ PhysicalNetworks= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNHostName= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 13 13:29:45 crc kubenswrapper[4797]: ++ ovs_dir=/var/lib/openvswitch Oct 13 13:29:45 crc kubenswrapper[4797]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 13 13:29:45 crc kubenswrapper[4797]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 13 13:29:45 crc kubenswrapper[4797]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + sleep 0.5 Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + sleep 0.5 Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + cleanup_ovsdb_server_semaphore Oct 13 13:29:45 crc kubenswrapper[4797]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 13 13:29:45 crc kubenswrapper[4797]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 13 13:29:45 crc kubenswrapper[4797]: > Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.448498 4797 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 13 13:29:45 crc kubenswrapper[4797]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 13 13:29:45 crc kubenswrapper[4797]: + source /usr/local/bin/container-scripts/functions Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNBridge=br-int Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNRemote=tcp:localhost:6642 Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNEncapType=geneve Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNAvailabilityZones= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ EnableChassisAsGateway=true Oct 13 13:29:45 crc kubenswrapper[4797]: ++ PhysicalNetworks= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ OVNHostName= Oct 13 13:29:45 crc kubenswrapper[4797]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 13 13:29:45 crc kubenswrapper[4797]: ++ ovs_dir=/var/lib/openvswitch Oct 13 13:29:45 crc kubenswrapper[4797]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 13 13:29:45 crc kubenswrapper[4797]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 13 13:29:45 crc kubenswrapper[4797]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + sleep 0.5 Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + sleep 0.5 Oct 13 13:29:45 crc kubenswrapper[4797]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 13 13:29:45 crc kubenswrapper[4797]: + cleanup_ovsdb_server_semaphore Oct 13 13:29:45 crc kubenswrapper[4797]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 13 13:29:45 crc kubenswrapper[4797]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 13 13:29:45 crc kubenswrapper[4797]: > pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" containerID="cri-o://a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.449044 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" containerID="cri-o://a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" gracePeriod=29 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.449446 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a754-account-create-545z2"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.469483 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="galera" containerID="cri-o://d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.485702 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapia754-account-delete-6khh6"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.506178 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7hjgk"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 
13:29:45.537027 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7hjgk"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.540992 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-57bdg_1fc3b8cc-c74c-402d-8284-7d578bfa7c02/openstack-network-exporter/0.log" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.541070 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.547872 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-66cds"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.559495 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047404d9-b0ab-44e2-a31d-94d8fe429698/ovsdbserver-sb/0.log" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.559665 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.581995 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-66cds"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.596767 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b951-account-create-slxdp"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.603953 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-22bc-account-create-rrkb5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.609533 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-22bc-account-create-rrkb5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.619261 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b951-account-create-slxdp"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.627022 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell122bc-account-delete-v4zs5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.632566 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6xtp5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.646848 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6xtp5"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650402 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-config\") pod \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650486 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdb-rundir\") pod 
\"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650523 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-config\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650554 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-metrics-certs-tls-certs\") pod \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650620 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znls5\" (UniqueName: \"kubernetes.io/projected/047404d9-b0ab-44e2-a31d-94d8fe429698-kube-api-access-znls5\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650666 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovn-rundir\") pod \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650738 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-combined-ca-bundle\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650776 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovs-rundir\") pod \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650829 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdbserver-sb-tls-certs\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650858 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.650892 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-combined-ca-bundle\") pod \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.651031 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnpw\" (UniqueName: \"kubernetes.io/projected/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-kube-api-access-mwnpw\") pod \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\" (UID: \"1fc3b8cc-c74c-402d-8284-7d578bfa7c02\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.651066 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-metrics-certs-tls-certs\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: 
\"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.651144 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-scripts\") pod \"047404d9-b0ab-44e2-a31d-94d8fe429698\" (UID: \"047404d9-b0ab-44e2-a31d-94d8fe429698\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.652258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-config" (OuterVolumeSpecName: "config") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.652512 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-scripts" (OuterVolumeSpecName: "scripts") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.652569 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "1fc3b8cc-c74c-402d-8284-7d578bfa7c02" (UID: "1fc3b8cc-c74c-402d-8284-7d578bfa7c02"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.653088 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-config" (OuterVolumeSpecName: "config") pod "1fc3b8cc-c74c-402d-8284-7d578bfa7c02" (UID: "1fc3b8cc-c74c-402d-8284-7d578bfa7c02"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.653569 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.653664 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "1fc3b8cc-c74c-402d-8284-7d578bfa7c02" (UID: "1fc3b8cc-c74c-402d-8284-7d578bfa7c02"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.656070 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047404d9-b0ab-44e2-a31d-94d8fe429698-kube-api-access-znls5" (OuterVolumeSpecName: "kube-api-access-znls5") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "kube-api-access-znls5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.656330 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-cde4-account-create-7vdmk"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.658218 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-cde4-account-create-7vdmk"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.660619 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.664829 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.664895 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerDied","Data":"a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.665990 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementcde4-account-delete-dpcwh"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.674536 4797 generic.go:334] "Generic (PLEG): container finished" podID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerID="5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e" exitCode=143 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.674704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e2be119d-ecfb-4f81-b947-46797c215b8e","Type":"ContainerDied","Data":"5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.674795 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.675023 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f7e069b9ab89c7959910da337a2d82dec852dac12fc5e241175f7c593d851a00" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.679340 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-kube-api-access-mwnpw" (OuterVolumeSpecName: "kube-api-access-mwnpw") pod "1fc3b8cc-c74c-402d-8284-7d578bfa7c02" (UID: "1fc3b8cc-c74c-402d-8284-7d578bfa7c02"). InnerVolumeSpecName "kube-api-access-mwnpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.696220 4797 generic.go:334] "Generic (PLEG): container finished" podID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerID="b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e" exitCode=143 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.696302 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b2f17a4-493b-4b76-9dea-ef70ed8e1525","Type":"ContainerDied","Data":"b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.726699 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-575995d4c4-dskht"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.726924 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-575995d4c4-dskht" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api-log" containerID="cri-o://7fbdf32f1bb326754cf202e723f8c39032e73df93800b78e5c5cdcd412a37605" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.727384 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-575995d4c4-dskht" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api" containerID="cri-o://658bce01fb068a8991c6ad520dbfd6eedee82a6e9a0f4f191a112fb1f5f569bf" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728781 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="7c966ea8b8d377fe98198c55458614115fcf39c47512b6e6a01c20502d489780" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728796 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" 
containerID="0771ac639aaa1011650222d8f818316f8651f5729cf3d02726d6f291d0ca0403" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728844 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="88fe19d7acc00d4d082fc5672f55a59aa59797313060f49a5e87d490f16e6bd6" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728851 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="95610d8f86f925fa29c39c4b649aca588a5bfca6031effb0eddf4c0ba26766bc" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728857 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="2c54fef5660226ff362a71ac3e917e001db5e9a347a08deab58b5660776811fc" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728863 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="f98ce95dd4a03f4ac462cac39a646ae12a813083511d7a3c50bdddd1624ba0aa" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728869 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="7fa1e18bf4f048760b982cf61e4fdc5706474901c9f59b877086707fd0c2bad5" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728876 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="a05561b229ddf692864eb50f4e3134b3ae7ce64dcd992877ca1a87198bca77ca" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728882 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="826bb87dfbab6e082d77678c431c5f60fe685f53d7f5d122666545d7e4d18b90" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728887 4797 generic.go:334] "Generic (PLEG): container 
finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="e34ee38f199fac8191d033ce294a7a0959606e96937bb15c3973792f94df9fb9" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728896 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="e14c6e172f563574bf54ee4a48fdf4ed5d505dd4e2e640e88c9ca54cdfa00ed3" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728902 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="fd4ad2d986ae8532b8471784274e34f2e4066c64a28827eb0bdc5088d9c323f3" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728908 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="ad8ecb3fe40030b9df7dc7b9b77499e35e2b251475733a87367b79e69bc6d068" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728914 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="c4ce4b63b22c727785eb5abec889eca7d94fdcd5d0b3cda4520df492e1acde65" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728950 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"7c966ea8b8d377fe98198c55458614115fcf39c47512b6e6a01c20502d489780"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728970 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"0771ac639aaa1011650222d8f818316f8651f5729cf3d02726d6f291d0ca0403"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728980 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"88fe19d7acc00d4d082fc5672f55a59aa59797313060f49a5e87d490f16e6bd6"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728989 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"95610d8f86f925fa29c39c4b649aca588a5bfca6031effb0eddf4c0ba26766bc"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.728997 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"2c54fef5660226ff362a71ac3e917e001db5e9a347a08deab58b5660776811fc"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729006 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"f98ce95dd4a03f4ac462cac39a646ae12a813083511d7a3c50bdddd1624ba0aa"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729014 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"7fa1e18bf4f048760b982cf61e4fdc5706474901c9f59b877086707fd0c2bad5"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729023 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"a05561b229ddf692864eb50f4e3134b3ae7ce64dcd992877ca1a87198bca77ca"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"826bb87dfbab6e082d77678c431c5f60fe685f53d7f5d122666545d7e4d18b90"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 
13:29:45.729040 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"e34ee38f199fac8191d033ce294a7a0959606e96937bb15c3973792f94df9fb9"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"e14c6e172f563574bf54ee4a48fdf4ed5d505dd4e2e640e88c9ca54cdfa00ed3"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729056 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"fd4ad2d986ae8532b8471784274e34f2e4066c64a28827eb0bdc5088d9c323f3"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"ad8ecb3fe40030b9df7dc7b9b77499e35e2b251475733a87367b79e69bc6d068"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.729073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"c4ce4b63b22c727785eb5abec889eca7d94fdcd5d0b3cda4520df492e1acde65"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.737989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.745338 4797 generic.go:334] "Generic (PLEG): container finished" podID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerID="2cf43831975875d820d476539ef4c0943fa120e06b5317c6aec843e932949edd" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.745945 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cc4b497b-efb0-4294-8af9-c16bb2835e36","Type":"ContainerDied","Data":"2cf43831975875d820d476539ef4c0943fa120e06b5317c6aec843e932949edd"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.750151 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.763953 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnpw\" (UniqueName: \"kubernetes.io/projected/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-kube-api-access-mwnpw\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.763989 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764004 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764017 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764029 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/047404d9-b0ab-44e2-a31d-94d8fe429698-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764039 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znls5\" (UniqueName: \"kubernetes.io/projected/047404d9-b0ab-44e2-a31d-94d8fe429698-kube-api-access-znls5\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764050 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764061 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764071 4797 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.764097 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.765347 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-76fd44f586-tp25f"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.767113 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerID="f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026" exitCode=143 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.767207 4797 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" containerID="cri-o://530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" gracePeriod=29 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.770693 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-778fd9d9d-t868n" event={"ID":"d5e82c55-e59e-4d97-800c-66a4f9555047","Type":"ContainerDied","Data":"f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.771327 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener-log" containerID="cri-o://91313e817dc0492cf7a60f479444a1c27f27b873cabe704400743e07a7510a7d" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.771839 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener" containerID="cri-o://af88e3a88bd4391cf42637ef4fd96c25933e8e2d449d7aa7ee46670261ef0157" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.773256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance496f-account-delete-qst6j" event={"ID":"312a660f-ea89-49ac-8857-16dae844353f","Type":"ContainerStarted","Data":"323ef85aaeb8b705a1858529352142233dc9716056c6c8cb85038aef3c8e34e9"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.774330 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-cc74bd777-hvb5p"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.782877 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f5b9c"] Oct 13 13:29:45 crc 
kubenswrapper[4797]: I1013 13:29:45.785965 4797 generic.go:334] "Generic (PLEG): container finished" podID="416aefad-3318-4406-b1c1-fdba0ce21437" containerID="52fe991f1e2fd4f29fc6e6714cf185447bb07fb1c6293b5cab3549dd2ea20248" exitCode=143 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.786012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"416aefad-3318-4406-b1c1-fdba0ce21437","Type":"ContainerDied","Data":"52fe991f1e2fd4f29fc6e6714cf185447bb07fb1c6293b5cab3549dd2ea20248"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.790947 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f5b9c"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.792513 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderfaff-account-delete-ltnnq" event={"ID":"5421ab8e-2db8-4909-b67b-e0491f7b80e7","Type":"ContainerStarted","Data":"216d0d423d478b6567928f1732c5f96a7a5a63213749725894be0dbfdb1d409a"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.798602 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.799718 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" containerName="nova-scheduler-scheduler" containerID="cri-o://eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.807605 4797 generic.go:334] "Generic (PLEG): container finished" podID="6fed4821-c587-408b-b6b0-bcc080170628" containerID="6d39515a47c02aadeb66a5f42bd4f250497bcfc5b08bfe8c08ae7fa7aae9544b" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.807656 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8957dfc-xhpz5" 
event={"ID":"6fed4821-c587-408b-b6b0-bcc080170628","Type":"ContainerDied","Data":"6d39515a47c02aadeb66a5f42bd4f250497bcfc5b08bfe8c08ae7fa7aae9544b"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.808934 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.809085 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c7503305-66a2-4504-b208-6795946d8701" containerName="nova-cell0-conductor-conductor" containerID="cri-o://aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.814046 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047404d9-b0ab-44e2-a31d-94d8fe429698/ovsdbserver-sb/0.log" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.814193 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047404d9-b0ab-44e2-a31d-94d8fe429698","Type":"ContainerDied","Data":"962baa4a87d5c8ee7ad1b1e524a17b551a24101a2211c7f0874b486652e9f832"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.814223 4797 scope.go:117] "RemoveContainer" containerID="eb49a8ba0c15790a316319a7af2bb9f90a15b3fcad1167d493975eeda527d705" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.814339 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.822710 4797 generic.go:334] "Generic (PLEG): container finished" podID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerID="f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.822767 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bf758599-wncdc" event={"ID":"a7fec705-3fa8-4f2b-aa9d-1afec561d884","Type":"ContainerDied","Data":"f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.822798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bf758599-wncdc" event={"ID":"a7fec705-3fa8-4f2b-aa9d-1afec561d884","Type":"ContainerDied","Data":"48dc4b9d6c500fbfa452e70a74642c8c46bc5987d563149c8f6821dfa227895d"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.822952 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bf758599-wncdc" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.825029 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.827334 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_244e58a1-ed2c-4ff6-8885-ebd066e8adab/ovsdbserver-nb/0.log" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.827387 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"244e58a1-ed2c-4ff6-8885-ebd066e8adab","Type":"ContainerDied","Data":"a68ed4441c229162a797613f77467948c60aa25c7f213142f53e88b4f973d9c5"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.827446 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.828750 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9drpj"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.844818 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9drpj"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.855954 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc3b8cc-c74c-402d-8284-7d578bfa7c02" (UID: "1fc3b8cc-c74c-402d-8284-7d578bfa7c02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.856031 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.856342 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" containerName="nova-cell1-conductor-conductor" containerID="cri-o://45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.863087 4797 scope.go:117] "RemoveContainer" containerID="2f8d3e11442030783b8beb821adc64c961a2829fa98e12946ceb2502de4e83c9" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.865397 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2bvb\" (UniqueName: \"kubernetes.io/projected/a7fec705-3fa8-4f2b-aa9d-1afec561d884-kube-api-access-s2bvb\") pod \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.865432 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-svc\") pod \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.865573 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-config\") pod \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.865683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-sb\") pod \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.865726 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-swift-storage-0\") pod \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.865763 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-nb\") pod \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\" (UID: \"a7fec705-3fa8-4f2b-aa9d-1afec561d884\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.866152 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: 
E1013 13:29:45.866225 4797 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.866260 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:46.866248645 +0000 UTC m=+1364.399798901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.867229 4797 secret.go:188] Couldn't get secret openstack/barbican-worker-config-data: secret "barbican-worker-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: E1013 13:29:45.867313 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:46.86728841 +0000 UTC m=+1364.400838666 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-worker-config-data" not found Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.868215 4797 generic.go:334] "Generic (PLEG): container finished" podID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerID="5098f65b53c031667082257aa9c75ea641809f197a0ff696aad19b986be56dba" exitCode=143 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.868263 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2a3161e-b16d-436d-b547-87e182ef5e27","Type":"ContainerDied","Data":"5098f65b53c031667082257aa9c75ea641809f197a0ff696aad19b986be56dba"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.878273 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance496f-account-delete-qst6j"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.893195 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fec705-3fa8-4f2b-aa9d-1afec561d884-kube-api-access-s2bvb" (OuterVolumeSpecName: "kube-api-access-s2bvb") pod "a7fec705-3fa8-4f2b-aa9d-1afec561d884" (UID: "a7fec705-3fa8-4f2b-aa9d-1afec561d884"). InnerVolumeSpecName "kube-api-access-s2bvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.895881 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderfaff-account-delete-ltnnq"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.899505 4797 generic.go:334] "Generic (PLEG): container finished" podID="f8688abd-e654-404b-924c-9e4cf255f4e8" containerID="a4c476b6ff3b37f629fd62076e094c447b4d35d68c78802f6d598f476f60adb0" exitCode=137 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.899605 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.908151 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-57bdg_1fc3b8cc-c74c-402d-8284-7d578bfa7c02/openstack-network-exporter/0.log" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.908276 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-57bdg" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.909581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-57bdg" event={"ID":"1fc3b8cc-c74c-402d-8284-7d578bfa7c02","Type":"ContainerDied","Data":"57bb4040b4aab5e54aad325812c0ed957180a8bb668bec1728a3e319b457587d"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.949045 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.955946 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.964087 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.967150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-combined-ca-bundle\") pod \"f8688abd-e654-404b-924c-9e4cf255f4e8\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.967388 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config-secret\") pod \"f8688abd-e654-404b-924c-9e4cf255f4e8\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.967424 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config\") pod \"f8688abd-e654-404b-924c-9e4cf255f4e8\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.967475 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwl7t\" (UniqueName: \"kubernetes.io/projected/f8688abd-e654-404b-924c-9e4cf255f4e8-kube-api-access-lwl7t\") pod \"f8688abd-e654-404b-924c-9e4cf255f4e8\" (UID: \"f8688abd-e654-404b-924c-9e4cf255f4e8\") " Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.968004 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2bvb\" (UniqueName: 
\"kubernetes.io/projected/a7fec705-3fa8-4f2b-aa9d-1afec561d884-kube-api-access-s2bvb\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.968029 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.971656 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementcde4-account-delete-dpcwh"] Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.971923 4797 scope.go:117] "RemoveContainer" containerID="f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3" Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.979710 4797 generic.go:334] "Generic (PLEG): container finished" podID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerID="87d1861a88b2c106ab6c47eaad4f1a0647557f81e2c4766a50ad9d24c12524bf" exitCode=143 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.979790 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa","Type":"ContainerDied","Data":"87d1861a88b2c106ab6c47eaad4f1a0647557f81e2c4766a50ad9d24c12524bf"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.988846 4797 generic.go:334] "Generic (PLEG): container finished" podID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerID="366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117" exitCode=0 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.989047 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-cc74bd777-hvb5p" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker-log" containerID="cri-o://b77c094eeee1c437962694391b190f9591ed0b8e1210663b157e4b3d4c2a0207" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.989294 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57cbbb4d89-r9rvd" event={"ID":"11a6d485-2926-4d07-9b32-e81ab882de4c","Type":"ContainerDied","Data":"366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117"} Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.989534 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-cc74bd777-hvb5p" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker" containerID="cri-o://7f52052a3590935705a3b41e4aa7b4198b91ac9582dd55b94814a428e54fb81b" gracePeriod=30 Oct 13 13:29:45 crc kubenswrapper[4797]: I1013 13:29:45.997643 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell122bc-account-delete-v4zs5"] Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.005689 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8688abd-e654-404b-924c-9e4cf255f4e8-kube-api-access-lwl7t" (OuterVolumeSpecName: "kube-api-access-lwl7t") pod "f8688abd-e654-404b-924c-9e4cf255f4e8" (UID: "f8688abd-e654-404b-924c-9e4cf255f4e8"). InnerVolumeSpecName "kube-api-access-lwl7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: W1013 13:29:46.016737 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecdce3f4_f1b1_4323_b237_eb28b936ebc7.slice/crio-8d1c74252b830faaf48d72f41bd9065f690fcb5e8af24cb778af0d45d8f165d0 WatchSource:0}: Error finding container 8d1c74252b830faaf48d72f41bd9065f690fcb5e8af24cb778af0d45d8f165d0: Status 404 returned error can't find the container with id 8d1c74252b830faaf48d72f41bd9065f690fcb5e8af24cb778af0d45d8f165d0 Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.039019 4797 scope.go:117] "RemoveContainer" containerID="8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.070769 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwl7t\" (UniqueName: \"kubernetes.io/projected/f8688abd-e654-404b-924c-9e4cf255f4e8-kube-api-access-lwl7t\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.105178 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell084cf-account-delete-r9bpk"] Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.125010 4797 scope.go:117] "RemoveContainer" containerID="f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.130239 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3\": container with ID starting with f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3 not found: ID does not exist" containerID="f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.130274 4797 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3"} err="failed to get container status \"f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3\": rpc error: code = NotFound desc = could not find container \"f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3\": container with ID starting with f67257e3e1a1c5986f176debe54b11331896c78d077c52898951e4e00a7acef3 not found: ID does not exist" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.130330 4797 scope.go:117] "RemoveContainer" containerID="8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.130708 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724\": container with ID starting with 8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724 not found: ID does not exist" containerID="8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.130732 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724"} err="failed to get container status \"8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724\": rpc error: code = NotFound desc = could not find container \"8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724\": container with ID starting with 8d8fce65555c86053229d2ee3e490186f49eaf012bc64c9f7fb0365ee57c6724 not found: ID does not exist" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.130749 4797 scope.go:117] "RemoveContainer" containerID="ea620bd698810f04fad3cc655e6b00829d4603731d7d1dd124d75e6de787a1e3" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.132913 4797 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 13 13:29:46 crc kubenswrapper[4797]: W1013 13:29:46.138181 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda327836f_196f_4d29_8792_33085b552aa9.slice/crio-c5c93dc9186042cf70ddb8fc3f33f12b6cbf1d1ca8a19d6ca344620adbd4260a WatchSource:0}: Error finding container c5c93dc9186042cf70ddb8fc3f33f12b6cbf1d1ca8a19d6ca344620adbd4260a: Status 404 returned error can't find the container with id c5c93dc9186042cf70ddb8fc3f33f12b6cbf1d1ca8a19d6ca344620adbd4260a Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.155795 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8688abd-e654-404b-924c-9e4cf255f4e8" (UID: "f8688abd-e654-404b-924c-9e4cf255f4e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.165965 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapia754-account-delete-6khh6"] Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.175983 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.176012 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: W1013 13:29:46.180722 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cb03045_bfdc_4d0b_af8f_4e3c4717e792.slice/crio-42d53a9b5f5e3aaefae9fb34afa6990b4398360ded008da5af31342a3addee98 WatchSource:0}: Error finding container 42d53a9b5f5e3aaefae9fb34afa6990b4398360ded008da5af31342a3addee98: Status 404 returned error can't find the container with id 42d53a9b5f5e3aaefae9fb34afa6990b4398360ded008da5af31342a3addee98 Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.189733 4797 scope.go:117] "RemoveContainer" containerID="15a0ba06c59d7bea85972ec892e686e89aa4eb9037d3d04f437f8ad32558c17b" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.190181 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.193369 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.195562 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.195629 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" containerName="nova-cell1-conductor-conductor" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.197667 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a7fec705-3fa8-4f2b-aa9d-1afec561d884" (UID: "a7fec705-3fa8-4f2b-aa9d-1afec561d884"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.238792 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f8688abd-e654-404b-924c-9e4cf255f4e8" (UID: "f8688abd-e654-404b-924c-9e4cf255f4e8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.268958 4797 scope.go:117] "RemoveContainer" containerID="a4c476b6ff3b37f629fd62076e094c447b4d35d68c78802f6d598f476f60adb0" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.293883 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.293920 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.348357 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-config" (OuterVolumeSpecName: "config") pod "a7fec705-3fa8-4f2b-aa9d-1afec561d884" (UID: "a7fec705-3fa8-4f2b-aa9d-1afec561d884"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.361018 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1fc3b8cc-c74c-402d-8284-7d578bfa7c02" (UID: "1fc3b8cc-c74c-402d-8284-7d578bfa7c02"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.377577 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a7fec705-3fa8-4f2b-aa9d-1afec561d884" (UID: "a7fec705-3fa8-4f2b-aa9d-1afec561d884"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.396794 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "047404d9-b0ab-44e2-a31d-94d8fe429698" (UID: "047404d9-b0ab-44e2-a31d-94d8fe429698"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.400726 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.400758 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc3b8cc-c74c-402d-8284-7d578bfa7c02-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.400772 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.400782 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/047404d9-b0ab-44e2-a31d-94d8fe429698-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.405029 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7fec705-3fa8-4f2b-aa9d-1afec561d884" (UID: "a7fec705-3fa8-4f2b-aa9d-1afec561d884"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.429438 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a7fec705-3fa8-4f2b-aa9d-1afec561d884" (UID: "a7fec705-3fa8-4f2b-aa9d-1afec561d884"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.502655 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.502688 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7fec705-3fa8-4f2b-aa9d-1afec561d884-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.546078 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.552981 4797 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.559523 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.559616 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="ovn-northd" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.592404 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f8688abd-e654-404b-924c-9e4cf255f4e8" (UID: "f8688abd-e654-404b-924c-9e4cf255f4e8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.607620 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.607693 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data podName:21067728-d3cf-4ff2-94c9-87600f7324ab nodeName:}" failed. 
No retries permitted until 2025-10-13 13:29:50.607676638 +0000 UTC m=+1368.141226894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data") pod "rabbitmq-server-0" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab") : configmap "rabbitmq-config-data" not found Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.607629 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8688abd-e654-404b-924c-9e4cf255f4e8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.814146 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce is running failed: container process not found" containerID="d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.814776 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce is running failed: container process not found" containerID="d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.818343 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce is running failed: container process not found" 
containerID="d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.818480 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="galera" Oct 13 13:29:46 crc kubenswrapper[4797]: I1013 13:29:46.849288 4797 scope.go:117] "RemoveContainer" containerID="58cfde229cd18c95dec2460726a8982f10895baa52b242e5b8b923162d984a0a" Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.915524 4797 secret.go:188] Couldn't get secret openstack/barbican-worker-config-data: secret "barbican-worker-config-data" not found Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.915592 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:48.915572359 +0000 UTC m=+1366.449122615 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-worker-config-data" not found Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.915947 4797 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 13 13:29:46 crc kubenswrapper[4797]: E1013 13:29:46.915981 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:48.915971489 +0000 UTC m=+1366.449521745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-config-data" not found Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.004152 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell122bc-account-delete-v4zs5" event={"ID":"3111854e-cfce-493d-a094-63479ed35583","Type":"ContainerStarted","Data":"8e81067639dfc13496740be1a5d2b5c3c39f9b729b68ebe3ac5b4752c0b39280"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.005675 4797 generic.go:334] "Generic (PLEG): container finished" podID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerID="7fbdf32f1bb326754cf202e723f8c39032e73df93800b78e5c5cdcd412a37605" exitCode=143 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.005718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575995d4c4-dskht" event={"ID":"3d561c30-1e2f-4a3d-b042-8191c88e4bb6","Type":"ContainerDied","Data":"7fbdf32f1bb326754cf202e723f8c39032e73df93800b78e5c5cdcd412a37605"} 
Oct 13 13:29:47 crc kubenswrapper[4797]: E1013 13:29:47.017313 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:47 crc kubenswrapper[4797]: E1013 13:29:47.017401 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data podName:acdec9fc-360a-46e4-89ea-3fde84f417c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:51.017381609 +0000 UTC m=+1368.550931865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data") pod "rabbitmq-cell1-server-0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0") : configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.022136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapia754-account-delete-6khh6" event={"ID":"3cb03045-bfdc-4d0b-af8f-4e3c4717e792","Type":"ContainerStarted","Data":"42d53a9b5f5e3aaefae9fb34afa6990b4398360ded008da5af31342a3addee98"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.025646 4797 generic.go:334] "Generic (PLEG): container finished" podID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerID="b77c094eeee1c437962694391b190f9591ed0b8e1210663b157e4b3d4c2a0207" exitCode=143 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.025730 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc74bd777-hvb5p" event={"ID":"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0","Type":"ContainerDied","Data":"b77c094eeee1c437962694391b190f9591ed0b8e1210663b157e4b3d4c2a0207"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.028635 4797 generic.go:334] "Generic (PLEG): container finished" podID="6fed4821-c587-408b-b6b0-bcc080170628" containerID="b0e1792b043ca301b8bfdc8d457a0e08d3ef2be1af52069dc3da398ff15a9eb7" exitCode=0 Oct 13 
13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.028660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8957dfc-xhpz5" event={"ID":"6fed4821-c587-408b-b6b0-bcc080170628","Type":"ContainerDied","Data":"b0e1792b043ca301b8bfdc8d457a0e08d3ef2be1af52069dc3da398ff15a9eb7"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.030724 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementcde4-account-delete-dpcwh" event={"ID":"ecdce3f4-f1b1-4323-b237-eb28b936ebc7","Type":"ContainerStarted","Data":"8d1c74252b830faaf48d72f41bd9065f690fcb5e8af24cb778af0d45d8f165d0"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.035718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell084cf-account-delete-r9bpk" event={"ID":"a327836f-196f-4d29-8792-33085b552aa9","Type":"ContainerStarted","Data":"c5c93dc9186042cf70ddb8fc3f33f12b6cbf1d1ca8a19d6ca344620adbd4260a"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.039348 4797 generic.go:334] "Generic (PLEG): container finished" podID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" containerID="f7e069b9ab89c7959910da337a2d82dec852dac12fc5e241175f7c593d851a00" exitCode=0 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.039408 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e","Type":"ContainerDied","Data":"f7e069b9ab89c7959910da337a2d82dec852dac12fc5e241175f7c593d851a00"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.039559 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e","Type":"ContainerDied","Data":"b83f748df79935f480a3ca22d9fd0f8e3033754f61916af139f831e4c58a27ae"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.039616 4797 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b83f748df79935f480a3ca22d9fd0f8e3033754f61916af139f831e4c58a27ae" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.042412 4797 generic.go:334] "Generic (PLEG): container finished" podID="5421ab8e-2db8-4909-b67b-e0491f7b80e7" containerID="0046c194d5cb2a17a8e468e7c97653174cda795dc3a35ef52e73abaa120bddf3" exitCode=0 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.042467 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderfaff-account-delete-ltnnq" event={"ID":"5421ab8e-2db8-4909-b67b-e0491f7b80e7","Type":"ContainerDied","Data":"0046c194d5cb2a17a8e468e7c97653174cda795dc3a35ef52e73abaa120bddf3"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.050438 4797 generic.go:334] "Generic (PLEG): container finished" podID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerID="d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce" exitCode=0 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.050534 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f12892ce-6d68-4f79-b1dd-e874dffba145","Type":"ContainerDied","Data":"d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.050566 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f12892ce-6d68-4f79-b1dd-e874dffba145","Type":"ContainerDied","Data":"559fda50c03ecc7086dae8eeea693d15ca9e11a5636b2093106d5a1dbac0d7f3"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.050603 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559fda50c03ecc7086dae8eeea693d15ca9e11a5636b2093106d5a1dbac0d7f3" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.055898 4797 generic.go:334] "Generic (PLEG): container finished" podID="312a660f-ea89-49ac-8857-16dae844353f" containerID="d2554b76de82af7c27df20bec7682a8cbc461613a3a2b1a5e9ff6acf46612daf" exitCode=0 Oct 
13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.055972 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance496f-account-delete-qst6j" event={"ID":"312a660f-ea89-49ac-8857-16dae844353f","Type":"ContainerDied","Data":"d2554b76de82af7c27df20bec7682a8cbc461613a3a2b1a5e9ff6acf46612daf"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.067131 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.070380 4797 generic.go:334] "Generic (PLEG): container finished" podID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerID="91313e817dc0492cf7a60f479444a1c27f27b873cabe704400743e07a7510a7d" exitCode=143 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.070519 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" event={"ID":"b6d097e4-da24-434c-9b9f-2e84279240a6","Type":"ContainerDied","Data":"91313e817dc0492cf7a60f479444a1c27f27b873cabe704400743e07a7510a7d"} Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.150585 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.199080 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bf758599-wncdc"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.205896 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65bf758599-wncdc"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.211843 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-57bdg"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.217927 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-57bdg"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.223229 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.228661 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-config-data\") pod \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.228690 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.228698 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrrtq\" (UniqueName: \"kubernetes.io/projected/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-kube-api-access-xrrtq\") pod \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.228926 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-nova-novncproxy-tls-certs\") pod \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.228976 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-combined-ca-bundle\") pod \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.229155 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh98n\" (UniqueName: \"kubernetes.io/projected/f12892ce-6d68-4f79-b1dd-e874dffba145-kube-api-access-xh98n\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.229196 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-vencrypt-tls-certs\") pod \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\" (UID: \"fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.229248 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-galera-tls-certs\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.229286 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-combined-ca-bundle\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 
crc kubenswrapper[4797]: I1013 13:29:47.229311 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-secrets\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.260224 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-secrets" (OuterVolumeSpecName: "secrets") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.265972 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12892ce-6d68-4f79-b1dd-e874dffba145-kube-api-access-xh98n" (OuterVolumeSpecName: "kube-api-access-xh98n") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "kube-api-access-xh98n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.266874 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-kube-api-access-xrrtq" (OuterVolumeSpecName: "kube-api-access-xrrtq") pod "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" (UID: "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e"). InnerVolumeSpecName "kube-api-access-xrrtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.273285 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" path="/var/lib/kubelet/pods/047404d9-b0ab-44e2-a31d-94d8fe429698/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.274379 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc3b8cc-c74c-402d-8284-7d578bfa7c02" path="/var/lib/kubelet/pods/1fc3b8cc-c74c-402d-8284-7d578bfa7c02/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.275354 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" path="/var/lib/kubelet/pods/244e58a1-ed2c-4ff6-8885-ebd066e8adab/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.280164 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3833a450-53fb-44f6-974d-b2496e3a98d8" path="/var/lib/kubelet/pods/3833a450-53fb-44f6-974d-b2496e3a98d8/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.281419 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8201a9-2b2a-46f0-81ea-db939e25d192" path="/var/lib/kubelet/pods/3e8201a9-2b2a-46f0-81ea-db939e25d192/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.281977 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d71a130-6ed9-4cba-8f40-dea82ad6e42e" path="/var/lib/kubelet/pods/4d71a130-6ed9-4cba-8f40-dea82ad6e42e/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.282456 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830fc84d-bd05-42f9-a2c8-9404e9a1acb7" path="/var/lib/kubelet/pods/830fc84d-bd05-42f9-a2c8-9404e9a1acb7/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.283790 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9280efc5-f863-4350-8044-7da90a6982fb" 
path="/var/lib/kubelet/pods/9280efc5-f863-4350-8044-7da90a6982fb/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.284394 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" path="/var/lib/kubelet/pods/a7fec705-3fa8-4f2b-aa9d-1afec561d884/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.288053 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a88a7d56-1b07-41ef-94ec-d39f1c9bfb76" path="/var/lib/kubelet/pods/a88a7d56-1b07-41ef-94ec-d39f1c9bfb76/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.290643 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cc6e8d-cce5-423b-a60c-49587bc50452" path="/var/lib/kubelet/pods/a8cc6e8d-cce5-423b-a60c-49587bc50452/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.291345 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68b9280-7334-47da-ab97-8dcb4c4f3016" path="/var/lib/kubelet/pods/c68b9280-7334-47da-ab97-8dcb4c4f3016/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.291990 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35fbf40-e053-49c6-9eb9-fbdd8337061c" path="/var/lib/kubelet/pods/d35fbf40-e053-49c6-9eb9-fbdd8337061c/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.292442 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-config-data" (OuterVolumeSpecName: "config-data") pod "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" (UID: "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.292927 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3b1264-55ce-4cb4-a390-2fb520ae9b87" path="/var/lib/kubelet/pods/ee3b1264-55ce-4cb4-a390-2fb520ae9b87/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.294863 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6745956-a404-4481-9d3e-3a8056d7aaf2" path="/var/lib/kubelet/pods/f6745956-a404-4481-9d3e-3a8056d7aaf2/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.295833 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8688abd-e654-404b-924c-9e4cf255f4e8" path="/var/lib/kubelet/pods/f8688abd-e654-404b-924c-9e4cf255f4e8/volumes" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.308657 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" (UID: "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.326911 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" (UID: "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.327199 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.330775 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-generated\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.330905 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-operator-scripts\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.330989 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.331013 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-kolla-config\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.331042 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-default\") pod \"f12892ce-6d68-4f79-b1dd-e874dffba145\" (UID: \"f12892ce-6d68-4f79-b1dd-e874dffba145\") " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.331449 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.332155 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.332307 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.335986 4797 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336045 4797 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-secrets\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336056 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336068 4797 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336076 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336086 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336095 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrrtq\" (UniqueName: \"kubernetes.io/projected/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-kube-api-access-xrrtq\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 
13:29:47.336104 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336112 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh98n\" (UniqueName: \"kubernetes.io/projected/f12892ce-6d68-4f79-b1dd-e874dffba145-kube-api-access-xh98n\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.336122 4797 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.338459 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" (UID: "fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.342999 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.343970 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.349040 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f12892ce-6d68-4f79-b1dd-e874dffba145" (UID: "f12892ce-6d68-4f79-b1dd-e874dffba145"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.441128 4797 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.441441 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12892ce-6d68-4f79-b1dd-e874dffba145-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.441722 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.441825 4797 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f12892ce-6d68-4f79-b1dd-e874dffba145-kolla-config\") on node 
\"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.505377 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.525305 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6c8957dfc-xhpz5" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.532312 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6c8957dfc-xhpz5" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.543842 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.579900 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.579934 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.579945 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.580115 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" containerName="memcached" 
containerID="cri-o://cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.580915 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-central-agent" containerID="cri-o://8b3dbc305e498b09f402b2934e96ebeacc6ba59d0b362bafb37d2783cec22f3c" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.581040 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="proxy-httpd" containerID="cri-o://26a477d9a49a32396a4c8959aafd79c16854ffb6419bb4f4a5f1de667a6d6c19" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.581055 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-notification-agent" containerID="cri-o://3e128c54e1e732791a3db021ae19e7a7d4bd2ecbe1361a7e7ba33071c47c83c3" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.581081 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="sg-core" containerID="cri-o://0eb68056284c660ad9015956fa330c09c86d1bd6d0b2e7c87865c45cb9f710fa" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.581200 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" containerName="kube-state-metrics" containerID="cri-o://be06c7925ca69832ee2ecd465bccfa99df2217df9880fecf2f1eaf2ed8591ad0" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.640988 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-bootstrap-tsc96"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.648865 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8ftfs"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.657073 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tsc96"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.663816 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8ftfs"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.672195 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-789bb6874b-qp58p"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.672407 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-789bb6874b-qp58p" podUID="6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" containerName="keystone-api" containerID="cri-o://9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155" gracePeriod=30 Oct 13 13:29:47 crc kubenswrapper[4797]: E1013 13:29:47.681832 4797 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 13 13:29:47 crc kubenswrapper[4797]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-13T13:29:45Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 13 13:29:47 crc kubenswrapper[4797]: /etc/init.d/functions: line 589: 463 Alarm clock "$@" Oct 13 13:29:47 crc kubenswrapper[4797]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-htk8n" message=< Oct 13 13:29:47 crc kubenswrapper[4797]: Exiting ovn-controller (1) [FAILED] Oct 13 13:29:47 crc kubenswrapper[4797]: Killing ovn-controller (1) [ OK ] Oct 13 13:29:47 crc kubenswrapper[4797]: 2025-10-13T13:29:45Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 13 13:29:47 crc kubenswrapper[4797]: 
/etc/init.d/functions: line 589: 463 Alarm clock "$@" Oct 13 13:29:47 crc kubenswrapper[4797]: > Oct 13 13:29:47 crc kubenswrapper[4797]: E1013 13:29:47.681880 4797 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 13 13:29:47 crc kubenswrapper[4797]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-13T13:29:45Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 13 13:29:47 crc kubenswrapper[4797]: /etc/init.d/functions: line 589: 463 Alarm clock "$@" Oct 13 13:29:47 crc kubenswrapper[4797]: > pod="openstack/ovn-controller-htk8n" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" containerID="cri-o://c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.681943 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-htk8n" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" containerID="cri-o://c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" gracePeriod=27 Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.691401 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.728921 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-g4xv6"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.732788 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-g4xv6"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.748180 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-849f-account-create-8cfr6"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.759122 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-849f-account-create-8cfr6"] Oct 13 13:29:47 crc kubenswrapper[4797]: I1013 13:29:47.900986 4797 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerName="galera" containerID="cri-o://a0ba3520e5651533522d5be4eedd2ce11b85f4a41e04d516a04f5658408ca62b" gracePeriod=30 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.039980 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-25b92"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.050986 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-25b92"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.079612 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-496f-account-create-jjvzm"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.090549 4797 generic.go:334] "Generic (PLEG): container finished" podID="a327836f-196f-4d29-8792-33085b552aa9" containerID="3deb1ed3ceb86e7beaf83228fb0eb3de9d1897badd1d88888f0e381fccb33a91" exitCode=0 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.090616 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell084cf-account-delete-r9bpk" event={"ID":"a327836f-196f-4d29-8792-33085b552aa9","Type":"ContainerDied","Data":"3deb1ed3ceb86e7beaf83228fb0eb3de9d1897badd1d88888f0e381fccb33a91"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.119615 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-496f-account-create-jjvzm"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.120618 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.120655 4797 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.120696 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.132447 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance496f-account-delete-qst6j"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.132962 4797 generic.go:334] "Generic (PLEG): container finished" podID="3cb03045-bfdc-4d0b-af8f-4e3c4717e792" containerID="3ff088fac921f69c3d43fce69b617b7b587ee7207254fee948e2db608f2464d1" exitCode=0 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.133033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapia754-account-delete-6khh6" event={"ID":"3cb03045-bfdc-4d0b-af8f-4e3c4717e792","Type":"ContainerDied","Data":"3ff088fac921f69c3d43fce69b617b7b587ee7207254fee948e2db608f2464d1"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.137555 4797 generic.go:334] "Generic (PLEG): container finished" podID="ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" containerID="be06c7925ca69832ee2ecd465bccfa99df2217df9880fecf2f1eaf2ed8591ad0" exitCode=2 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.137792 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a2f1a197e052d816aea722ded8ddb41413d4d55e91b26d0412cfadb04dd4ef6"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.137875 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed","Type":"ContainerDied","Data":"be06c7925ca69832ee2ecd465bccfa99df2217df9880fecf2f1eaf2ed8591ad0"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.137884 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://7a2f1a197e052d816aea722ded8ddb41413d4d55e91b26d0412cfadb04dd4ef6" gracePeriod=600 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.151189 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jdw9k"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.155185 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-htk8n_85dd770b-9d5c-4cc9-adaa-87963d5bb160/ovn-controller/0.log" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.155250 4797 generic.go:334] "Generic (PLEG): container finished" podID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerID="c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" exitCode=143 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.159019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n" event={"ID":"85dd770b-9d5c-4cc9-adaa-87963d5bb160","Type":"ContainerDied","Data":"c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.159082 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jdw9k"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.176352 4797 generic.go:334] "Generic (PLEG): container finished" podID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerID="26a477d9a49a32396a4c8959aafd79c16854ffb6419bb4f4a5f1de667a6d6c19" exitCode=0 Oct 13 13:29:48 crc 
kubenswrapper[4797]: I1013 13:29:48.176388 4797 generic.go:334] "Generic (PLEG): container finished" podID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerID="0eb68056284c660ad9015956fa330c09c86d1bd6d0b2e7c87865c45cb9f710fa" exitCode=2 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.176464 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerDied","Data":"26a477d9a49a32396a4c8959aafd79c16854ffb6419bb4f4a5f1de667a6d6c19"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.176501 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerDied","Data":"0eb68056284c660ad9015956fa330c09c86d1bd6d0b2e7c87865c45cb9f710fa"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.204133 4797 generic.go:334] "Generic (PLEG): container finished" podID="ecdce3f4-f1b1-4323-b237-eb28b936ebc7" containerID="fe63cb02070cc071d8992b28f2861cf1fe5293a685eab37ca5fb6a3cb41aabd2" exitCode=0 Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.204440 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementcde4-account-delete-dpcwh" event={"ID":"ecdce3f4-f1b1-4323-b237-eb28b936ebc7","Type":"ContainerDied","Data":"fe63cb02070cc071d8992b28f2861cf1fe5293a685eab37ca5fb6a3cb41aabd2"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.215007 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderfaff-account-delete-ltnnq"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.220185 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-faff-account-create-wnr29"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.236699 4797 generic.go:334] "Generic (PLEG): container finished" podID="3111854e-cfce-493d-a094-63479ed35583" containerID="6d8dea0f7fd3f99f103a729af71199c45e8cb48ada730f3c234f6e561cabd22e" exitCode=1 Oct 13 
13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.237017 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.238096 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell122bc-account-delete-v4zs5" event={"ID":"3111854e-cfce-493d-a094-63479ed35583","Type":"ContainerDied","Data":"6d8dea0f7fd3f99f103a729af71199c45e8cb48ada730f3c234f6e561cabd22e"} Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.238393 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.257770 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-faff-account-create-wnr29"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.270045 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c is running failed: container process not found" containerID="c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.273314 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c is running failed: container process not found" containerID="c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.275084 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c is running failed: container process not found" containerID="c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.275111 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-htk8n" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.295205 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.299565 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.301168 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container 
process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.301218 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.330940 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.334908 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.342063 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.342114 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.360397 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.167:8776/healthcheck\": read tcp 10.217.0.2:49602->10.217.0.167:8776: read: connection reset by peer" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.407048 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:49060->10.217.0.206:8775: read: connection reset by peer" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.407402 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:49076->10.217.0.206:8775: read: connection reset by peer" Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.711371 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.720120 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.735120 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.735216 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c7503305-66a2-4504-b208-6795946d8701" containerName="nova-cell0-conductor-conductor" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.903480 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-575995d4c4-dskht" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:47270->10.217.0.168:9311: read: connection reset by peer" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.903524 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-575995d4c4-dskht" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:47278->10.217.0.168:9311: read: connection reset by peer" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.920203 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.921450 4797 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e is running failed: container process not found" containerID="eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.928610 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e is running failed: container process not found" containerID="eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.929097 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-htk8n_85dd770b-9d5c-4cc9-adaa-87963d5bb160/ovn-controller/0.log" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.929177 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-htk8n" Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.929278 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e is running failed: container process not found" containerID="eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 13:29:48 crc kubenswrapper[4797]: E1013 13:29:48.929328 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" containerName="nova-scheduler-scheduler" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.930883 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.937768 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.943617 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.948651 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 13:29:48 crc kubenswrapper[4797]: I1013 13:29:48.998100 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016432 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-log-ovn\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016472 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbr5d\" (UniqueName: \"kubernetes.io/projected/85dd770b-9d5c-4cc9-adaa-87963d5bb160-kube-api-access-fbr5d\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016537 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-config\") pod \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016596 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85dd770b-9d5c-4cc9-adaa-87963d5bb160-scripts\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016613 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-certs\") pod \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016758 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run-ovn\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016912 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-combined-ca-bundle\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.016994 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.017017 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-combined-ca-bundle\") pod \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.017051 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-ovn-controller-tls-certs\") pod \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\" (UID: \"85dd770b-9d5c-4cc9-adaa-87963d5bb160\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.017079 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74n9p\" (UniqueName: \"kubernetes.io/projected/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-api-access-74n9p\") pod 
\"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\" (UID: \"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed\") " Oct 13 13:29:49 crc kubenswrapper[4797]: E1013 13:29:49.017648 4797 secret.go:188] Couldn't get secret openstack/barbican-worker-config-data: secret "barbican-worker-config-data" not found Oct 13 13:29:49 crc kubenswrapper[4797]: E1013 13:29:49.017718 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:53.017702351 +0000 UTC m=+1370.551252607 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-worker-config-data" not found Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.018264 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run" (OuterVolumeSpecName: "var-run") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.021506 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.022137 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.022709 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85dd770b-9d5c-4cc9-adaa-87963d5bb160-scripts" (OuterVolumeSpecName: "scripts") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: E1013 13:29:49.022847 4797 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 13 13:29:49 crc kubenswrapper[4797]: E1013 13:29:49.022908 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data podName:99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:53.022886018 +0000 UTC m=+1370.556436354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data") pod "barbican-worker-cc74bd777-hvb5p" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0") : secret "barbican-config-data" not found Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.031521 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.036058 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-api-access-74n9p" (OuterVolumeSpecName: "kube-api-access-74n9p") pod "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" (UID: "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed"). InnerVolumeSpecName "kube-api-access-74n9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.048129 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85dd770b-9d5c-4cc9-adaa-87963d5bb160-kube-api-access-fbr5d" (OuterVolumeSpecName: "kube-api-access-fbr5d") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "kube-api-access-fbr5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.064177 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.084897 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.114660 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" (UID: "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.118931 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhskg\" (UniqueName: \"kubernetes.io/projected/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kube-api-access-dhskg\") pod \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.118990 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-public-tls-certs\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.119017 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5kpc\" (UniqueName: \"kubernetes.io/projected/a327836f-196f-4d29-8792-33085b552aa9-kube-api-access-r5kpc\") pod \"a327836f-196f-4d29-8792-33085b552aa9\" (UID: \"a327836f-196f-4d29-8792-33085b552aa9\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.119080 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-config-data\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.119158 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-log-httpd\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.119201 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kolla-config\") pod \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.119235 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-combined-ca-bundle\") pod \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.119253 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-combined-ca-bundle\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.120923 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" (UID: "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122522 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122695 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-internal-tls-certs\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-config-data\") pod \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122794 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-etc-swift\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122862 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7mf8\" (UniqueName: \"kubernetes.io/projected/312a660f-ea89-49ac-8857-16dae844353f-kube-api-access-x7mf8\") pod \"312a660f-ea89-49ac-8857-16dae844353f\" (UID: \"312a660f-ea89-49ac-8857-16dae844353f\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122920 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-run-httpd\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.122989 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-memcached-tls-certs\") pod \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\" (UID: \"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.123081 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94xd\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-kube-api-access-c94xd\") pod \"6fed4821-c587-408b-b6b0-bcc080170628\" (UID: \"6fed4821-c587-408b-b6b0-bcc080170628\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.125071 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a327836f-196f-4d29-8792-33085b552aa9-kube-api-access-r5kpc" (OuterVolumeSpecName: "kube-api-access-r5kpc") pod "a327836f-196f-4d29-8792-33085b552aa9" (UID: "a327836f-196f-4d29-8792-33085b552aa9"). InnerVolumeSpecName "kube-api-access-r5kpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.125233 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kube-api-access-dhskg" (OuterVolumeSpecName: "kube-api-access-dhskg") pod "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" (UID: "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab"). InnerVolumeSpecName "kube-api-access-dhskg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.125225 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.125746 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.127908 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.129150 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-config-data" (OuterVolumeSpecName: "config-data") pod "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" (UID: "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.136420 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.124266 4797 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137163 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137179 4797 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137191 4797 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137200 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137210 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74n9p\" (UniqueName: \"kubernetes.io/projected/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-api-access-74n9p\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137220 4797 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85dd770b-9d5c-4cc9-adaa-87963d5bb160-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137228 4797 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbr5d\" (UniqueName: \"kubernetes.io/projected/85dd770b-9d5c-4cc9-adaa-87963d5bb160-kube-api-access-fbr5d\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.137237 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85dd770b-9d5c-4cc9-adaa-87963d5bb160-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.140194 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-kube-api-access-c94xd" (OuterVolumeSpecName: "kube-api-access-c94xd") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "kube-api-access-c94xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.141045 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312a660f-ea89-49ac-8857-16dae844353f-kube-api-access-x7mf8" (OuterVolumeSpecName: "kube-api-access-x7mf8") pod "312a660f-ea89-49ac-8857-16dae844353f" (UID: "312a660f-ea89-49ac-8857-16dae844353f"). InnerVolumeSpecName "kube-api-access-x7mf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.141333 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.152356 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.176010 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.185015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" (UID: "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.192536 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.218466 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.236638 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.237559 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-scripts\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.237602 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxxwm\" (UniqueName: \"kubernetes.io/projected/5421ab8e-2db8-4909-b67b-e0491f7b80e7-kube-api-access-jxxwm\") pod \"5421ab8e-2db8-4909-b67b-e0491f7b80e7\" (UID: \"5421ab8e-2db8-4909-b67b-e0491f7b80e7\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.237649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-public-tls-certs\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.237848 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-scripts\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.237906 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9w9g\" (UniqueName: \"kubernetes.io/projected/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-kube-api-access-p9w9g\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238024 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238069 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-combined-ca-bundle\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238473 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data-custom\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238509 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-public-tls-certs\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238561 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-logs\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238582 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-internal-tls-certs\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc 
kubenswrapper[4797]: I1013 13:29:49.238613 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-internal-tls-certs\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238659 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95tw5\" (UniqueName: \"kubernetes.io/projected/3111854e-cfce-493d-a094-63479ed35583-kube-api-access-95tw5\") pod \"3111854e-cfce-493d-a094-63479ed35583\" (UID: \"3111854e-cfce-493d-a094-63479ed35583\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238684 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4mnz\" (UniqueName: \"kubernetes.io/projected/ecdce3f4-f1b1-4323-b237-eb28b936ebc7-kube-api-access-z4mnz\") pod \"ecdce3f4-f1b1-4323-b237-eb28b936ebc7\" (UID: \"ecdce3f4-f1b1-4323-b237-eb28b936ebc7\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238728 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-etc-machine-id\") pod \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\" (UID: \"6b2f17a4-493b-4b76-9dea-ef70ed8e1525\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238753 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqm27\" (UniqueName: \"kubernetes.io/projected/d5e82c55-e59e-4d97-800c-66a4f9555047-kube-api-access-cqm27\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238781 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-config-data\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238865 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e82c55-e59e-4d97-800c-66a4f9555047-logs\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.238895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-combined-ca-bundle\") pod \"d5e82c55-e59e-4d97-800c-66a4f9555047\" (UID: \"d5e82c55-e59e-4d97-800c-66a4f9555047\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.239078 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-logs" (OuterVolumeSpecName: "logs") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.239525 4797 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.239552 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c94xd\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-kube-api-access-c94xd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.239691 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhskg\" (UniqueName: \"kubernetes.io/projected/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-kube-api-access-dhskg\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.239706 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5kpc\" (UniqueName: \"kubernetes.io/projected/a327836f-196f-4d29-8792-33085b552aa9-kube-api-access-r5kpc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.239717 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.241196 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.241209 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc 
kubenswrapper[4797]: I1013 13:29:49.241219 4797 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6fed4821-c587-408b-b6b0-bcc080170628-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.241232 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7mf8\" (UniqueName: \"kubernetes.io/projected/312a660f-ea89-49ac-8857-16dae844353f-kube-api-access-x7mf8\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.241242 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fed4821-c587-408b-b6b0-bcc080170628-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.242125 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-scripts" (OuterVolumeSpecName: "scripts") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.242215 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-scripts" (OuterVolumeSpecName: "scripts") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.242285 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.244083 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-kube-api-access-p9w9g" (OuterVolumeSpecName: "kube-api-access-p9w9g") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "kube-api-access-p9w9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.244079 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.246892 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e82c55-e59e-4d97-800c-66a4f9555047-logs" (OuterVolumeSpecName: "logs") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.250304 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23264a32-2fc6-48e6-aa01-6c6cf519f5b5" path="/var/lib/kubelet/pods/23264a32-2fc6-48e6-aa01-6c6cf519f5b5/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.250982 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332dc6dc-b877-41b6-836e-ad902bfbb12a" path="/var/lib/kubelet/pods/332dc6dc-b877-41b6-836e-ad902bfbb12a/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.251796 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3744809d-6956-4dfa-bede-4679ab2d9296" path="/var/lib/kubelet/pods/3744809d-6956-4dfa-bede-4679ab2d9296/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.252382 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f55e80-87b4-4286-afa0-ab9c7143b02f" path="/var/lib/kubelet/pods/46f55e80-87b4-4286-afa0-ab9c7143b02f/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.252998 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a945414-93e9-401f-aa0e-15e040d78017" path="/var/lib/kubelet/pods/5a945414-93e9-401f-aa0e-15e040d78017/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.253523 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768bafb1-558b-4033-a28c-a504a8f75281" path="/var/lib/kubelet/pods/768bafb1-558b-4033-a28c-a504a8f75281/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.254900 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77b871f-f807-4a5d-a6ac-ee918d9d1530" path="/var/lib/kubelet/pods/d77b871f-f807-4a5d-a6ac-ee918d9d1530/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.255362 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ec4b28-9317-420b-afeb-ac89bd5f0f9b" 
path="/var/lib/kubelet/pods/d8ec4b28-9317-420b-afeb-ac89bd5f0f9b/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.256082 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" path="/var/lib/kubelet/pods/f12892ce-6d68-4f79-b1dd-e874dffba145/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.263617 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" path="/var/lib/kubelet/pods/fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e/volumes" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.267378 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5421ab8e-2db8-4909-b67b-e0491f7b80e7-kube-api-access-jxxwm" (OuterVolumeSpecName: "kube-api-access-jxxwm") pod "5421ab8e-2db8-4909-b67b-e0491f7b80e7" (UID: "5421ab8e-2db8-4909-b67b-e0491f7b80e7"). InnerVolumeSpecName "kube-api-access-jxxwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.272987 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-htk8n_85dd770b-9d5c-4cc9-adaa-87963d5bb160/ovn-controller/0.log" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.273093 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-htk8n" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.276605 4797 generic.go:334] "Generic (PLEG): container finished" podID="c7503305-66a2-4504-b208-6795946d8701" containerID="aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.278117 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderfaff-account-delete-ltnnq" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.285980 4797 generic.go:334] "Generic (PLEG): container finished" podID="416aefad-3318-4406-b1c1-fdba0ce21437" containerID="c575ab6cf83d919cac32e185c4f667a6b3abc5c3952e0de54dbbac6e3ad28900" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.286605 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-57cbbb4d89-r9rvd" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.287933 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3111854e-cfce-493d-a094-63479ed35583-kube-api-access-95tw5" (OuterVolumeSpecName: "kube-api-access-95tw5") pod "3111854e-cfce-493d-a094-63479ed35583" (UID: "3111854e-cfce-493d-a094-63479ed35583"). InnerVolumeSpecName "kube-api-access-95tw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.304771 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell084cf-account-delete-r9bpk" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.307049 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdce3f4-f1b1-4323-b237-eb28b936ebc7-kube-api-access-z4mnz" (OuterVolumeSpecName: "kube-api-access-z4mnz") pod "ecdce3f4-f1b1-4323-b237-eb28b936ebc7" (UID: "ecdce3f4-f1b1-4323-b237-eb28b936ebc7"). InnerVolumeSpecName "kube-api-access-z4mnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.309105 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e82c55-e59e-4d97-800c-66a4f9555047-kube-api-access-cqm27" (OuterVolumeSpecName: "kube-api-access-cqm27") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "kube-api-access-cqm27". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.320888 4797 generic.go:334] "Generic (PLEG): container finished" podID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" containerID="eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.324137 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance496f-account-delete-qst6j" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.328565 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.333038 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" containerID="cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.333255 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.338234 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell122bc-account-delete-v4zs5" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.343674 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n46m9\" (UniqueName: \"kubernetes.io/projected/e2be119d-ecfb-4f81-b947-46797c215b8e-kube-api-access-n46m9\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.344132 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69xt\" (UniqueName: \"kubernetes.io/projected/3cb03045-bfdc-4d0b-af8f-4e3c4717e792-kube-api-access-q69xt\") pod \"3cb03045-bfdc-4d0b-af8f-4e3c4717e792\" (UID: \"3cb03045-bfdc-4d0b-af8f-4e3c4717e792\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.344373 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-scripts\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.345428 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-logs\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.346266 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-config-data\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.346740 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-httpd-run\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.347282 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.347569 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-combined-ca-bundle\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.347686 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-internal-tls-certs\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.348103 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e2be119d-ecfb-4f81-b947-46797c215b8e\" (UID: \"e2be119d-ecfb-4f81-b947-46797c215b8e\") " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.350925 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerID="15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.351051 4797 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-778fd9d9d-t868n" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353096 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqm27\" (UniqueName: \"kubernetes.io/projected/d5e82c55-e59e-4d97-800c-66a4f9555047-kube-api-access-cqm27\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353125 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e82c55-e59e-4d97-800c-66a4f9555047-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353138 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353152 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxxwm\" (UniqueName: \"kubernetes.io/projected/5421ab8e-2db8-4909-b67b-e0491f7b80e7-kube-api-access-jxxwm\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353163 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353173 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9w9g\" (UniqueName: \"kubernetes.io/projected/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-kube-api-access-p9w9g\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353181 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 
13:29:49.353191 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353199 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95tw5\" (UniqueName: \"kubernetes.io/projected/3111854e-cfce-493d-a094-63479ed35583-kube-api-access-95tw5\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353208 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4mnz\" (UniqueName: \"kubernetes.io/projected/ecdce3f4-f1b1-4323-b237-eb28b936ebc7-kube-api-access-z4mnz\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.353216 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.355408 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-logs" (OuterVolumeSpecName: "logs") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.373762 4797 generic.go:334] "Generic (PLEG): container finished" podID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerID="14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.373945 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.377157 4797 generic.go:334] "Generic (PLEG): container finished" podID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerID="bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.377249 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.381722 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" (UID: "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.383643 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2be119d-ecfb-4f81-b947-46797c215b8e-kube-api-access-n46m9" (OuterVolumeSpecName: "kube-api-access-n46m9") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "kube-api-access-n46m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.384419 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb03045-bfdc-4d0b-af8f-4e3c4717e792-kube-api-access-q69xt" (OuterVolumeSpecName: "kube-api-access-q69xt") pod "3cb03045-bfdc-4d0b-af8f-4e3c4717e792" (UID: "3cb03045-bfdc-4d0b-af8f-4e3c4717e792"). InnerVolumeSpecName "kube-api-access-q69xt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.398428 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.398533 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-scripts" (OuterVolumeSpecName: "scripts") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.399065 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="7a2f1a197e052d816aea722ded8ddb41413d4d55e91b26d0412cfadb04dd4ef6" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.404687 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementcde4-account-delete-dpcwh" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.409867 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c8957dfc-xhpz5" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.413053 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapia754-account-delete-6khh6" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.416183 4797 generic.go:334] "Generic (PLEG): container finished" podID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerID="2ce2d5a74583559ea08ad5502820fbbd8cb181a62c48ed513f6762ab6f0ec152" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.419118 4797 generic.go:334] "Generic (PLEG): container finished" podID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerID="658bce01fb068a8991c6ad520dbfd6eedee82a6e9a0f4f191a112fb1f5f569bf" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.424340 4797 generic.go:334] "Generic (PLEG): container finished" podID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerID="8b3dbc305e498b09f402b2934e96ebeacc6ba59d0b362bafb37d2783cec22f3c" exitCode=0 Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.458426 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.458469 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.458482 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2be119d-ecfb-4f81-b947-46797c215b8e-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.458513 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.458528 4797 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-n46m9\" (UniqueName: \"kubernetes.io/projected/e2be119d-ecfb-4f81-b947-46797c215b8e-kube-api-access-n46m9\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.458542 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q69xt\" (UniqueName: \"kubernetes.io/projected/3cb03045-bfdc-4d0b-af8f-4e3c4717e792-kube-api-access-q69xt\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.486621 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.535965 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" (UID: "ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.559898 4797 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.559930 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.603287 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.607125 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.616853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.651485 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.651600 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-config-data" (OuterVolumeSpecName: "config-data") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.669383 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.669429 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.669443 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.669455 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc 
kubenswrapper[4797]: I1013 13:29:49.669468 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.677357 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.713531 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.715647 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.724990 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.726661 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-config-data" (OuterVolumeSpecName: "config-data") pod "6fed4821-c587-408b-b6b0-bcc080170628" (UID: "6fed4821-c587-408b-b6b0-bcc080170628"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.735559 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.744199 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data" (OuterVolumeSpecName: "config-data") pod "6b2f17a4-493b-4b76-9dea-ef70ed8e1525" (UID: "6b2f17a4-493b-4b76-9dea-ef70ed8e1525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.753200 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" (UID: "cb5bf67e-bfcd-4a21-b67f-9d2893da31ab"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.762559 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "85dd770b-9d5c-4cc9-adaa-87963d5bb160" (UID: "85dd770b-9d5c-4cc9-adaa-87963d5bb160"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770591 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770618 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770629 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/85dd770b-9d5c-4cc9-adaa-87963d5bb160-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770640 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770648 4797 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770657 4797 reconciler_common.go:293] "Volume 
detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770665 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770673 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2f17a4-493b-4b76-9dea-ef70ed8e1525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.770682 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fed4821-c587-408b-b6b0-bcc080170628-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.778411 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-config-data" (OuterVolumeSpecName: "config-data") pod "e2be119d-ecfb-4f81-b947-46797c215b8e" (UID: "e2be119d-ecfb-4f81-b947-46797c215b8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.793250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.808651 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d5e82c55-e59e-4d97-800c-66a4f9555047" (UID: "d5e82c55-e59e-4d97-800c-66a4f9555047"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821239 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-htk8n" event={"ID":"85dd770b-9d5c-4cc9-adaa-87963d5bb160","Type":"ContainerDied","Data":"a4a9aab6f4c0faac2c2585b800c20e01af800d5b5329d28e1dab0619122eeaa1"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821290 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c7503305-66a2-4504-b208-6795946d8701","Type":"ContainerDied","Data":"aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderfaff-account-delete-ltnnq" event={"ID":"5421ab8e-2db8-4909-b67b-e0491f7b80e7","Type":"ContainerDied","Data":"216d0d423d478b6567928f1732c5f96a7a5a63213749725894be0dbfdb1d409a"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821316 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="216d0d423d478b6567928f1732c5f96a7a5a63213749725894be0dbfdb1d409a" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821325 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"416aefad-3318-4406-b1c1-fdba0ce21437","Type":"ContainerDied","Data":"c575ab6cf83d919cac32e185c4f667a6b3abc5c3952e0de54dbbac6e3ad28900"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 
13:29:49.821340 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell084cf-account-delete-r9bpk" event={"ID":"a327836f-196f-4d29-8792-33085b552aa9","Type":"ContainerDied","Data":"c5c93dc9186042cf70ddb8fc3f33f12b6cbf1d1ca8a19d6ca344620adbd4260a"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821351 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6","Type":"ContainerDied","Data":"eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821363 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance496f-account-delete-qst6j" event={"ID":"312a660f-ea89-49ac-8857-16dae844353f","Type":"ContainerDied","Data":"323ef85aaeb8b705a1858529352142233dc9716056c6c8cb85038aef3c8e34e9"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821372 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="323ef85aaeb8b705a1858529352142233dc9716056c6c8cb85038aef3c8e34e9" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821379 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed","Type":"ContainerDied","Data":"bd517bbc565e9f1dcced1ca5f49edd0136a1c502ba0357f09621f60b395484c4"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821391 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab","Type":"ContainerDied","Data":"cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821400 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb5bf67e-bfcd-4a21-b67f-9d2893da31ab","Type":"ContainerDied","Data":"2be24ed4ffb6d0135f074533e692c594fecabeca9e1e729455ef7aa0af6ec4f2"} 
Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821410 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell122bc-account-delete-v4zs5" event={"ID":"3111854e-cfce-493d-a094-63479ed35583","Type":"ContainerDied","Data":"8e81067639dfc13496740be1a5d2b5c3c39f9b729b68ebe3ac5b4752c0b39280"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821421 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-778fd9d9d-t868n" event={"ID":"d5e82c55-e59e-4d97-800c-66a4f9555047","Type":"ContainerDied","Data":"15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821432 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-778fd9d9d-t868n" event={"ID":"d5e82c55-e59e-4d97-800c-66a4f9555047","Type":"ContainerDied","Data":"87b4a365c29d87d012645e72faf44be655aa1e8a7cac0dd56068d6bbc703513f"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821441 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2be119d-ecfb-4f81-b947-46797c215b8e","Type":"ContainerDied","Data":"14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821452 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2be119d-ecfb-4f81-b947-46797c215b8e","Type":"ContainerDied","Data":"f72353f78f89b5ecf63f883eb7c92b882454c58c1f86b5ffa57341e163f8f8d6"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821461 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b2f17a4-493b-4b76-9dea-ef70ed8e1525","Type":"ContainerDied","Data":"bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821470 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"6b2f17a4-493b-4b76-9dea-ef70ed8e1525","Type":"ContainerDied","Data":"a53f1c7a7c1e6e96ec178bbdc83a3b7e28f3c89077a62580d8233e38b871188d"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821478 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"7a2f1a197e052d816aea722ded8ddb41413d4d55e91b26d0412cfadb04dd4ef6"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821490 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementcde4-account-delete-dpcwh" event={"ID":"ecdce3f4-f1b1-4323-b237-eb28b936ebc7","Type":"ContainerDied","Data":"8d1c74252b830faaf48d72f41bd9065f690fcb5e8af24cb778af0d45d8f165d0"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821500 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8957dfc-xhpz5" event={"ID":"6fed4821-c587-408b-b6b0-bcc080170628","Type":"ContainerDied","Data":"58bd0faa103d5cf78cea4f0ad31e883565b2b75c51534697f9eb48399ee2b257"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821511 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapia754-account-delete-6khh6" event={"ID":"3cb03045-bfdc-4d0b-af8f-4e3c4717e792","Type":"ContainerDied","Data":"42d53a9b5f5e3aaefae9fb34afa6990b4398360ded008da5af31342a3addee98"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821519 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa","Type":"ContainerDied","Data":"2ce2d5a74583559ea08ad5502820fbbd8cb181a62c48ed513f6762ab6f0ec152"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821529 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575995d4c4-dskht" 
event={"ID":"3d561c30-1e2f-4a3d-b042-8191c88e4bb6","Type":"ContainerDied","Data":"658bce01fb068a8991c6ad520dbfd6eedee82a6e9a0f4f191a112fb1f5f569bf"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.821541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerDied","Data":"8b3dbc305e498b09f402b2934e96ebeacc6ba59d0b362bafb37d2783cec22f3c"} Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.830091 4797 scope.go:117] "RemoveContainer" containerID="c1f45af3970cc786037a9d09839c3b48d54254bcac9f280549e8da329ed5ed7c" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.873937 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2be119d-ecfb-4f81-b947-46797c215b8e-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.874393 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.874427 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e82c55-e59e-4d97-800c-66a4f9555047-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.931132 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.936829 4797 scope.go:117] "RemoveContainer" containerID="3deb1ed3ceb86e7beaf83228fb0eb3de9d1897badd1d88888f0e381fccb33a91" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.943721 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:29:49 crc kubenswrapper[4797]: I1013 13:29:49.992369 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.009372 4797 scope.go:117] "RemoveContainer" containerID="be06c7925ca69832ee2ecd465bccfa99df2217df9880fecf2f1eaf2ed8591ad0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.038626 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091079 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-combined-ca-bundle\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091158 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz5rs\" (UniqueName: \"kubernetes.io/projected/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-kube-api-access-wz5rs\") pod \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091203 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-combined-ca-bundle\") pod \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091284 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") 
" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091310 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddln6\" (UniqueName: \"kubernetes.io/projected/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-kube-api-access-ddln6\") pod \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091352 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-nova-metadata-tls-certs\") pod \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091380 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-config-data\") pod \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\" (UID: \"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091405 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-logs\") pod \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091428 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-httpd-run\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091475 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpkhd\" (UniqueName: 
\"kubernetes.io/projected/416aefad-3318-4406-b1c1-fdba0ce21437-kube-api-access-vpkhd\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091501 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-public-tls-certs\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091556 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-logs\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091790 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-combined-ca-bundle\") pod \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091862 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-config-data\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091885 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-config-data\") pod \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\" (UID: \"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.091926 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-scripts\") pod \"416aefad-3318-4406-b1c1-fdba0ce21437\" (UID: \"416aefad-3318-4406-b1c1-fdba0ce21437\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.093833 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-logs" (OuterVolumeSpecName: "logs") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.093906 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance496f-account-delete-qst6j"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.094126 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-logs" (OuterVolumeSpecName: "logs") pod "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" (UID: "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.095787 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.100114 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.105742 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-kube-api-access-ddln6" (OuterVolumeSpecName: "kube-api-access-ddln6") pod "f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" (UID: "f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6"). InnerVolumeSpecName "kube-api-access-ddln6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.105977 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-scripts" (OuterVolumeSpecName: "scripts") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.109188 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-kube-api-access-wz5rs" (OuterVolumeSpecName: "kube-api-access-wz5rs") pod "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" (UID: "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa"). InnerVolumeSpecName "kube-api-access-wz5rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.109706 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416aefad-3318-4406-b1c1-fdba0ce21437-kube-api-access-vpkhd" (OuterVolumeSpecName: "kube-api-access-vpkhd") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "kube-api-access-vpkhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.111056 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance496f-account-delete-qst6j"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.121130 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderfaff-account-delete-ltnnq"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.123142 4797 scope.go:117] "RemoveContainer" containerID="cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.133027 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.141356 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderfaff-account-delete-ltnnq"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.149433 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell084cf-account-delete-r9bpk"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.155394 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell084cf-account-delete-r9bpk"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.157446 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-config-data" (OuterVolumeSpecName: "config-data") pod "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" (UID: "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.163309 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell122bc-account-delete-v4zs5"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.189286 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.189534 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" (UID: "f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.196999 4797 scope.go:117] "RemoveContainer" containerID="cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.197616 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-public-tls-certs\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.197687 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-config-data\") pod \"c7503305-66a2-4504-b208-6795946d8701\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.197744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-logs\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.197825 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-combined-ca-bundle\") pod \"c7503305-66a2-4504-b208-6795946d8701\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.197870 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr9zt\" (UniqueName: \"kubernetes.io/projected/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-kube-api-access-hr9zt\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 
13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.197951 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-combined-ca-bundle\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198041 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198109 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z89s\" (UniqueName: \"kubernetes.io/projected/c7503305-66a2-4504-b208-6795946d8701-kube-api-access-8z89s\") pod \"c7503305-66a2-4504-b208-6795946d8701\" (UID: \"c7503305-66a2-4504-b208-6795946d8701\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198145 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-internal-tls-certs\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198176 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data-custom\") pod \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\" (UID: \"3d561c30-1e2f-4a3d-b042-8191c88e4bb6\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198612 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198641 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddln6\" (UniqueName: \"kubernetes.io/projected/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-kube-api-access-ddln6\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198656 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198668 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198680 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpkhd\" (UniqueName: \"kubernetes.io/projected/416aefad-3318-4406-b1c1-fdba0ce21437-kube-api-access-vpkhd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198692 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416aefad-3318-4406-b1c1-fdba0ce21437-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198704 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198717 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198728 4797 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198739 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz5rs\" (UniqueName: \"kubernetes.io/projected/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-kube-api-access-wz5rs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.198752 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.200149 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c\": container with ID starting with cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c not found: ID does not exist" containerID="cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.200343 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c"} err="failed to get container status \"cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c\": rpc error: code = NotFound desc = could not find container \"cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c\": container with ID starting with cd039029de14470512836faefcd9431e8ebee84dc53830473d5f4f71c8f24d3c not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.200452 4797 scope.go:117] "RemoveContainer" containerID="6d8dea0f7fd3f99f103a729af71199c45e8cb48ada730f3c234f6e561cabd22e" Oct 13 13:29:50 crc 
kubenswrapper[4797]: I1013 13:29:50.201329 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-logs" (OuterVolumeSpecName: "logs") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.208467 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell122bc-account-delete-v4zs5"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.223845 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.231939 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" (UID: "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.232000 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.241114 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7503305-66a2-4504-b208-6795946d8701-kube-api-access-8z89s" (OuterVolumeSpecName: "kube-api-access-8z89s") pod "c7503305-66a2-4504-b208-6795946d8701" (UID: "c7503305-66a2-4504-b208-6795946d8701"). InnerVolumeSpecName "kube-api-access-8z89s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.262622 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-kube-api-access-hr9zt" (OuterVolumeSpecName: "kube-api-access-hr9zt") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "kube-api-access-hr9zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.272798 4797 scope.go:117] "RemoveContainer" containerID="15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.273346 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementcde4-account-delete-dpcwh"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.282076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.291111 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementcde4-account-delete-dpcwh"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.297144 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.300696 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z89s\" (UniqueName: \"kubernetes.io/projected/c7503305-66a2-4504-b208-6795946d8701-kube-api-access-8z89s\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.300718 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.300728 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.300737 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr9zt\" (UniqueName: \"kubernetes.io/projected/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-kube-api-access-hr9zt\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.300745 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.307308 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.308829 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.311818 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-config-data" (OuterVolumeSpecName: "config-data") pod "f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" (UID: "f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.314833 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.318663 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-778fd9d9d-t868n"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.324078 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-778fd9d9d-t868n"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.329546 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c8957dfc-xhpz5"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.334635 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6c8957dfc-xhpz5"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.334793 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-config-data" (OuterVolumeSpecName: "config-data") pod "c7503305-66a2-4504-b208-6795946d8701" (UID: "c7503305-66a2-4504-b208-6795946d8701"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.339835 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapia754-account-delete-6khh6"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.344667 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapia754-account-delete-6khh6"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.351390 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-htk8n"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.355512 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-htk8n"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.359377 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.363318 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.402729 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.402764 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.417199 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7503305-66a2-4504-b208-6795946d8701" (UID: "c7503305-66a2-4504-b208-6795946d8701"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.418391 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.426304 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.437939 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-config-data" (OuterVolumeSpecName: "config-data") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.440972 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.443402 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "416aefad-3318-4406-b1c1-fdba0ce21437" (UID: "416aefad-3318-4406-b1c1-fdba0ce21437"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.446019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"416aefad-3318-4406-b1c1-fdba0ce21437","Type":"ContainerDied","Data":"1eba22894556bf52053730874bb909c993976173f99d4d27a41487b223cb3754"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.446253 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.448779 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.457153 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cefeac7c-e65d-4c12-8f7e-e56bf30c04fa","Type":"ContainerDied","Data":"c45f2506b30ee16013f3be2a9e77460b2bb6cc5f56459c665a0b2070ea0e1c62"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.457237 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.459485 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c7503305-66a2-4504-b208-6795946d8701","Type":"ContainerDied","Data":"4fa7f8b954978e855fbedc304fa886ac89da8580df573901cddc814942ca2f28"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.459607 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.460036 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data" (OuterVolumeSpecName: "config-data") pod "3d561c30-1e2f-4a3d-b042-8191c88e4bb6" (UID: "3d561c30-1e2f-4a3d-b042-8191c88e4bb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.464620 4797 generic.go:334] "Generic (PLEG): container finished" podID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerID="96c199433ec042676ed19e22d59f1e2298214f4e17f2643c330e718dd6cd93a6" exitCode=0 Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.465034 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2a3161e-b16d-436d-b547-87e182ef5e27","Type":"ContainerDied","Data":"96c199433ec042676ed19e22d59f1e2298214f4e17f2643c330e718dd6cd93a6"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.469302 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"300309d9-4375-4cce-8fb1-0833d2cfdcde","Type":"ContainerDied","Data":"a0ba3520e5651533522d5be4eedd2ce11b85f4a41e04d516a04f5658408ca62b"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.469914 4797 generic.go:334] "Generic (PLEG): container finished" podID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerID="a0ba3520e5651533522d5be4eedd2ce11b85f4a41e04d516a04f5658408ca62b" exitCode=0 Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.470010 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"300309d9-4375-4cce-8fb1-0833d2cfdcde","Type":"ContainerDied","Data":"a674b1af4d463b4e06f89e5b0191853be57dd6654cd17eb0f136ece2fbae1d13"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.470042 4797 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="a674b1af4d463b4e06f89e5b0191853be57dd6654cd17eb0f136ece2fbae1d13" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.472091 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" (UID: "cefeac7c-e65d-4c12-8f7e-e56bf30c04fa"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.473624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6","Type":"ContainerDied","Data":"37f701374f5f727c9573ca188d7d69b2c041a92307e0c7e1eabcd64ec3722489"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.473738 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.484070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.486749 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.207:3000/\": dial tcp 10.217.0.207:3000: connect: connection refused" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.487682 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575995d4c4-dskht" event={"ID":"3d561c30-1e2f-4a3d-b042-8191c88e4bb6","Type":"ContainerDied","Data":"db29d3b34ea1cf5562679209393def260dfd35fbfea46ae481040c8d0f040700"} Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.487829 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575995d4c4-dskht" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.504763 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505083 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505153 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505210 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505268 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505324 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7503305-66a2-4504-b208-6795946d8701-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505575 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 
13:29:50.505636 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d561c30-1e2f-4a3d-b042-8191c88e4bb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.505692 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416aefad-3318-4406-b1c1-fdba0ce21437-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.623701 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.631874 4797 scope.go:117] "RemoveContainer" containerID="f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.709644 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-galera-tls-certs\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.709704 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.709733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-generated\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.709840 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-operator-scripts\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.709905 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jxcg\" (UniqueName: \"kubernetes.io/projected/300309d9-4375-4cce-8fb1-0833d2cfdcde-kube-api-access-7jxcg\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.710647 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.711211 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.711348 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.711443 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-kolla-config\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.715693 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300309d9-4375-4cce-8fb1-0833d2cfdcde-kube-api-access-7jxcg" (OuterVolumeSpecName: "kube-api-access-7jxcg") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "kube-api-access-7jxcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.718949 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-secrets" (OuterVolumeSpecName: "secrets") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.723226 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-secrets\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.723359 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-combined-ca-bundle\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.723415 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-default\") pod \"300309d9-4375-4cce-8fb1-0833d2cfdcde\" (UID: \"300309d9-4375-4cce-8fb1-0833d2cfdcde\") " Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.724428 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.724484 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.724494 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data podName:21067728-d3cf-4ff2-94c9-87600f7324ab nodeName:}" failed. No retries permitted until 2025-10-13 13:29:58.724476925 +0000 UTC m=+1376.258027181 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data") pod "rabbitmq-server-0" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab") : configmap "rabbitmq-config-data" not found Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.724530 4797 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.724546 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jxcg\" (UniqueName: \"kubernetes.io/projected/300309d9-4375-4cce-8fb1-0833d2cfdcde-kube-api-access-7jxcg\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.724557 4797 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.724568 4797 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-secrets\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.725311 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.734907 4797 scope.go:117] "RemoveContainer" containerID="15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.739937 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.740019 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1\": container with ID starting with 15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1 not found: ID does not exist" containerID="15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.740092 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1"} err="failed to get container status \"15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1\": rpc error: code = NotFound desc = could not find container \"15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1\": container with ID starting with 15e2f5844a61cab4f3406761a946255268af3094c017a3f89e9d9d6a0a06b3d1 not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.740124 4797 scope.go:117] "RemoveContainer" containerID="f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.744224 4797 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026\": container with ID starting with f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026 not found: ID does not exist" containerID="f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.744985 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.744982 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026"} err="failed to get container status \"f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026\": rpc error: code = NotFound desc = could not find container \"f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026\": container with ID starting with f92eb182c86f5ebc1b55e35e6ebaf0b53e6772b112e92cce8213cf90110c1026 not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.745248 4797 scope.go:117] "RemoveContainer" containerID="14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.747917 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-575995d4c4-dskht"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.761894 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.762469 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "300309d9-4375-4cce-8fb1-0833d2cfdcde" (UID: "300309d9-4375-4cce-8fb1-0833d2cfdcde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.767399 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-575995d4c4-dskht"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.791156 4797 scope.go:117] "RemoveContainer" containerID="5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.791929 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.802918 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.813132 4797 scope.go:117] "RemoveContainer" containerID="14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.813284 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.815057 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4\": container with ID starting with 14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4 not found: ID does not exist" containerID="14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4" Oct 13 13:29:50 crc 
kubenswrapper[4797]: I1013 13:29:50.815085 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4"} err="failed to get container status \"14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4\": rpc error: code = NotFound desc = could not find container \"14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4\": container with ID starting with 14f8f7513577c04f3bc8c70c38562b364daf8b8a2754d149cd53b77f28fcf4d4 not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.815105 4797 scope.go:117] "RemoveContainer" containerID="5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.816066 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e\": container with ID starting with 5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e not found: ID does not exist" containerID="5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.816108 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e"} err="failed to get container status \"5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e\": rpc error: code = NotFound desc = could not find container \"5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e\": container with ID starting with 5047e602c53523ee75385241870f170eddf3e400c8f5154307fa867893e3573e not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.816136 4797 scope.go:117] "RemoveContainer" containerID="bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53" Oct 13 
13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.820358 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.826403 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-combined-ca-bundle\") pod \"e2a3161e-b16d-436d-b547-87e182ef5e27\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.826448 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572w4\" (UniqueName: \"kubernetes.io/projected/e2a3161e-b16d-436d-b547-87e182ef5e27-kube-api-access-572w4\") pod \"e2a3161e-b16d-436d-b547-87e182ef5e27\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.826518 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-public-tls-certs\") pod \"e2a3161e-b16d-436d-b547-87e182ef5e27\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.826592 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-internal-tls-certs\") pod \"e2a3161e-b16d-436d-b547-87e182ef5e27\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.826648 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2a3161e-b16d-436d-b547-87e182ef5e27-logs\") pod \"e2a3161e-b16d-436d-b547-87e182ef5e27\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.826693 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-config-data\") pod \"e2a3161e-b16d-436d-b547-87e182ef5e27\" (UID: \"e2a3161e-b16d-436d-b547-87e182ef5e27\") " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.827030 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.827048 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/300309d9-4375-4cce-8fb1-0833d2cfdcde-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.827057 4797 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/300309d9-4375-4cce-8fb1-0833d2cfdcde-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.827076 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.828463 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a3161e-b16d-436d-b547-87e182ef5e27-logs" (OuterVolumeSpecName: "logs") pod "e2a3161e-b16d-436d-b547-87e182ef5e27" (UID: "e2a3161e-b16d-436d-b547-87e182ef5e27"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.831789 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a3161e-b16d-436d-b547-87e182ef5e27-kube-api-access-572w4" (OuterVolumeSpecName: "kube-api-access-572w4") pod "e2a3161e-b16d-436d-b547-87e182ef5e27" (UID: "e2a3161e-b16d-436d-b547-87e182ef5e27"). InnerVolumeSpecName "kube-api-access-572w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.838165 4797 scope.go:117] "RemoveContainer" containerID="b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.843233 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.850408 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.850527 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.854713 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2a3161e-b16d-436d-b547-87e182ef5e27" (UID: "e2a3161e-b16d-436d-b547-87e182ef5e27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.857562 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.865517 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.869924 4797 scope.go:117] "RemoveContainer" containerID="bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.870687 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53\": container with ID starting with bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53 not found: ID does not exist" containerID="bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.870714 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-config-data" (OuterVolumeSpecName: "config-data") pod "e2a3161e-b16d-436d-b547-87e182ef5e27" (UID: "e2a3161e-b16d-436d-b547-87e182ef5e27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.870720 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53"} err="failed to get container status \"bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53\": rpc error: code = NotFound desc = could not find container \"bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53\": container with ID starting with bf518b71e25438928a378f5c3bfabd9ce6a8af5ad6e5d68c81e9f3e1f5d12f53 not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.870784 4797 scope.go:117] "RemoveContainer" containerID="b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e" Oct 13 13:29:50 crc kubenswrapper[4797]: E1013 13:29:50.872793 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e\": container with ID starting with b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e not found: ID does not exist" containerID="b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.872845 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e"} err="failed to get container status \"b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e\": rpc error: code = NotFound desc = could not find container \"b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e\": container with ID starting with b525fe6d91c98126c6fdec2f494bf2955f9437633fcc37cf958bfcaa039ef67e not found: ID does not exist" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.872871 4797 scope.go:117] "RemoveContainer" 
containerID="d5941c177a81de15babd5a721122470aaaaccd1b9033980aac5b1cd72b64076c" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.897846 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2a3161e-b16d-436d-b547-87e182ef5e27" (UID: "e2a3161e-b16d-436d-b547-87e182ef5e27"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.900454 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e2a3161e-b16d-436d-b547-87e182ef5e27" (UID: "e2a3161e-b16d-436d-b547-87e182ef5e27"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.906329 4797 scope.go:117] "RemoveContainer" containerID="fe63cb02070cc071d8992b28f2861cf1fe5293a685eab37ca5fb6a3cb41aabd2" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928728 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928757 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928768 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928776 
4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2a3161e-b16d-436d-b547-87e182ef5e27-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928784 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928794 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a3161e-b16d-436d-b547-87e182ef5e27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.928814 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572w4\" (UniqueName: \"kubernetes.io/projected/e2a3161e-b16d-436d-b547-87e182ef5e27-kube-api-access-572w4\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:50 crc kubenswrapper[4797]: I1013 13:29:50.989922 4797 scope.go:117] "RemoveContainer" containerID="b0e1792b043ca301b8bfdc8d457a0e08d3ef2be1af52069dc3da398ff15a9eb7" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.020938 4797 scope.go:117] "RemoveContainer" containerID="6d39515a47c02aadeb66a5f42bd4f250497bcfc5b08bfe8c08ae7fa7aae9544b" Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.034963 4797 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.035152 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data podName:acdec9fc-360a-46e4-89ea-3fde84f417c0 nodeName:}" failed. No retries permitted until 2025-10-13 13:29:59.035040781 +0000 UTC m=+1376.568591047 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data") pod "rabbitmq-cell1-server-0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0") : configmap "rabbitmq-cell1-config-data" not found Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.052876 4797 scope.go:117] "RemoveContainer" containerID="3ff088fac921f69c3d43fce69b617b7b587ee7207254fee948e2db608f2464d1" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.085642 4797 scope.go:117] "RemoveContainer" containerID="c575ab6cf83d919cac32e185c4f667a6b3abc5c3952e0de54dbbac6e3ad28900" Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.189062 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.198566 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.200332 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.200369 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container 
is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" containerName="nova-cell1-conductor-conductor" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.216280 4797 scope.go:117] "RemoveContainer" containerID="52fe991f1e2fd4f29fc6e6714cf185447bb07fb1c6293b5cab3549dd2ea20248" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.262556 4797 scope.go:117] "RemoveContainer" containerID="2ce2d5a74583559ea08ad5502820fbbd8cb181a62c48ed513f6762ab6f0ec152" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.270856 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3111854e-cfce-493d-a094-63479ed35583" path="/var/lib/kubelet/pods/3111854e-cfce-493d-a094-63479ed35583/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.271533 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312a660f-ea89-49ac-8857-16dae844353f" path="/var/lib/kubelet/pods/312a660f-ea89-49ac-8857-16dae844353f/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.272105 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb03045-bfdc-4d0b-af8f-4e3c4717e792" path="/var/lib/kubelet/pods/3cb03045-bfdc-4d0b-af8f-4e3c4717e792/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.272966 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" path="/var/lib/kubelet/pods/3d561c30-1e2f-4a3d-b042-8191c88e4bb6/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.274837 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" path="/var/lib/kubelet/pods/416aefad-3318-4406-b1c1-fdba0ce21437/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.275705 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5421ab8e-2db8-4909-b67b-e0491f7b80e7" 
path="/var/lib/kubelet/pods/5421ab8e-2db8-4909-b67b-e0491f7b80e7/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.276521 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" path="/var/lib/kubelet/pods/6b2f17a4-493b-4b76-9dea-ef70ed8e1525/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.278061 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fed4821-c587-408b-b6b0-bcc080170628" path="/var/lib/kubelet/pods/6fed4821-c587-408b-b6b0-bcc080170628/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.278729 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" path="/var/lib/kubelet/pods/85dd770b-9d5c-4cc9-adaa-87963d5bb160/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.279861 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a327836f-196f-4d29-8792-33085b552aa9" path="/var/lib/kubelet/pods/a327836f-196f-4d29-8792-33085b552aa9/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.280401 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7503305-66a2-4504-b208-6795946d8701" path="/var/lib/kubelet/pods/c7503305-66a2-4504-b208-6795946d8701/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.280926 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" path="/var/lib/kubelet/pods/cb5bf67e-bfcd-4a21-b67f-9d2893da31ab/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.281425 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" path="/var/lib/kubelet/pods/cefeac7c-e65d-4c12-8f7e-e56bf30c04fa/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.282672 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" 
path="/var/lib/kubelet/pods/d5e82c55-e59e-4d97-800c-66a4f9555047/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.283311 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" path="/var/lib/kubelet/pods/ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.293977 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" path="/var/lib/kubelet/pods/e2be119d-ecfb-4f81-b947-46797c215b8e/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.294918 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdce3f4-f1b1-4323-b237-eb28b936ebc7" path="/var/lib/kubelet/pods/ecdce3f4-f1b1-4323-b237-eb28b936ebc7/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.295394 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" path="/var/lib/kubelet/pods/f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6/volumes" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.310455 4797 scope.go:117] "RemoveContainer" containerID="87d1861a88b2c106ab6c47eaad4f1a0647557f81e2c4766a50ad9d24c12524bf" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.501400 4797 generic.go:334] "Generic (PLEG): container finished" podID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerID="84b296cdae027daaba2dce536affe2df5bb8565c9eaea497ef3762320f6ea09d" exitCode=0 Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.501459 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"acdec9fc-360a-46e4-89ea-3fde84f417c0","Type":"ContainerDied","Data":"84b296cdae027daaba2dce536affe2df5bb8565c9eaea497ef3762320f6ea09d"} Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.525089 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_d661f302-5234-4d18-9aa8-0eddd26153fe/ovn-northd/0.log" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.525136 4797 generic.go:334] "Generic (PLEG): container finished" podID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" exitCode=139 Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.525198 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d661f302-5234-4d18-9aa8-0eddd26153fe","Type":"ContainerDied","Data":"c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169"} Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.535004 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.539175 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169 is running failed: container process not found" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.539469 4797 generic.go:334] "Generic (PLEG): container finished" podID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerID="b14ae8c7513d0ced2b738189c2e015ff0743dc1feeeda16b9f6380925730cb4f" exitCode=0 Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.539540 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"21067728-d3cf-4ff2-94c9-87600f7324ab","Type":"ContainerDied","Data":"b14ae8c7513d0ced2b738189c2e015ff0743dc1feeeda16b9f6380925730cb4f"} Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.539704 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169 is running failed: container process not found" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.540184 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169 is running failed: container process not found" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.540233 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="ovn-northd" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.548159 4797 scope.go:117] "RemoveContainer" containerID="aab08153ea37f716e034bf774837202e797c20f3821e253a5ccd1da48ecfc7e4" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.558962 4797 generic.go:334] "Generic (PLEG): container finished" podID="6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" containerID="9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155" exitCode=0 Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.559032 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-789bb6874b-qp58p" event={"ID":"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e","Type":"ContainerDied","Data":"9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155"} Oct 13 13:29:51 crc 
kubenswrapper[4797]: I1013 13:29:51.559052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-789bb6874b-qp58p" event={"ID":"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e","Type":"ContainerDied","Data":"5aa99e1063a1d06b9d809420818758f38869724bffd3398b9dda527ddb797d2e"} Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.559096 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-789bb6874b-qp58p" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.566407 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2a3161e-b16d-436d-b547-87e182ef5e27","Type":"ContainerDied","Data":"f7f353052245c703cc603e6eb0933e0bc1e907108176ccd384de9c18ca71a4ea"} Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.566466 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.571072 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.637678 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643099 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-internal-tls-certs\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643175 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nmdl\" (UniqueName: \"kubernetes.io/projected/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-kube-api-access-5nmdl\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643214 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-fernet-keys\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643257 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-scripts\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643282 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-config-data\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643368 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-combined-ca-bundle\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643390 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.643416 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-credential-keys\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.647942 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.649774 4797 scope.go:117] "RemoveContainer" containerID="eb7d4e571e76b36027b315aa98010f3dd65dc78fd3a01a2aa0cae7369e4d667e" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.652478 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.652648 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-scripts" (OuterVolumeSpecName: "scripts") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.653014 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.662004 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-kube-api-access-5nmdl" (OuterVolumeSpecName: "kube-api-access-5nmdl") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "kube-api-access-5nmdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.677959 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.695896 4797 scope.go:117] "RemoveContainer" containerID="658bce01fb068a8991c6ad520dbfd6eedee82a6e9a0f4f191a112fb1f5f569bf" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.698070 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.702506 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.723156 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d661f302-5234-4d18-9aa8-0eddd26153fe/ovn-northd/0.log" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.723352 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.727992 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.735349 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-config-data" (OuterVolumeSpecName: "config-data") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745237 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-plugins\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745276 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745301 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21067728-d3cf-4ff2-94c9-87600f7324ab-pod-info\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745355 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745418 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-plugins-conf\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745459 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-confd\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745484 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21067728-d3cf-4ff2-94c9-87600f7324ab-erlang-cookie-secret\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745520 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-erlang-cookie\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745551 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745593 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-tls\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745625 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745656 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d865\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-kube-api-access-5d865\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745707 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-server-conf\") pod \"21067728-d3cf-4ff2-94c9-87600f7324ab\" (UID: \"21067728-d3cf-4ff2-94c9-87600f7324ab\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.745733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs\") pod \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\" (UID: \"6bf02d7e-6b92-4d2a-838f-20cdd6a7046e\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746030 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nmdl\" (UniqueName: \"kubernetes.io/projected/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-kube-api-access-5nmdl\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746046 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746055 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746065 4797 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746075 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746087 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746099 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: W1013 13:29:51.746187 4797 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e/volumes/kubernetes.io~secret/public-tls-certs Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.746201 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.749518 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21067728-d3cf-4ff2-94c9-87600f7324ab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.750153 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.750283 4797 scope.go:117] "RemoveContainer" containerID="7fbdf32f1bb326754cf202e723f8c39032e73df93800b78e5c5cdcd412a37605" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.751835 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.752368 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-kube-api-access-5d865" (OuterVolumeSpecName: "kube-api-access-5d865") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "kube-api-access-5d865". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.752937 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.753834 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.755166 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" (UID: "6bf02d7e-6b92-4d2a-838f-20cdd6a7046e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.756243 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/21067728-d3cf-4ff2-94c9-87600f7324ab-pod-info" (OuterVolumeSpecName: "pod-info") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.765729 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data" (OuterVolumeSpecName: "config-data") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.769367 4797 scope.go:117] "RemoveContainer" containerID="9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.789768 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.791253 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-server-conf" (OuterVolumeSpecName: "server-conf") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.791364 4797 scope.go:117] "RemoveContainer" containerID="9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155" Oct 13 13:29:51 crc kubenswrapper[4797]: E1013 13:29:51.792310 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155\": container with ID starting with 9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155 not found: ID does not exist" containerID="9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.792343 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155"} err="failed to get container status \"9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155\": rpc error: code = NotFound desc = could not find container \"9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155\": container with ID starting with 9a62cd7c84ab7e9f42d9574734c0a64f0e34918d01cc88cd3821d18b0063a155 not found: ID does not exist" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.792370 4797 scope.go:117] "RemoveContainer" containerID="96c199433ec042676ed19e22d59f1e2298214f4e17f2643c330e718dd6cd93a6" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.820937 4797 scope.go:117] "RemoveContainer" containerID="5098f65b53c031667082257aa9c75ea641809f197a0ff696aad19b986be56dba" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.824146 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.199:6080/vnc_lite.html\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.846770 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-scripts\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.847150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-northd-tls-certs\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.847501 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwsx\" (UniqueName: \"kubernetes.io/projected/d661f302-5234-4d18-9aa8-0eddd26153fe-kube-api-access-9fwsx\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.847589 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-combined-ca-bundle\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.847314 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-scripts" (OuterVolumeSpecName: "scripts") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.847741 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-metrics-certs-tls-certs\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.848013 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-rundir\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.848125 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-config\") pod \"d661f302-5234-4d18-9aa8-0eddd26153fe\" (UID: \"d661f302-5234-4d18-9aa8-0eddd26153fe\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.848379 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.848734 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.848894 4797 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-server-conf\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.848971 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849042 4797 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21067728-d3cf-4ff2-94c9-87600f7324ab-pod-info\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849396 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-config" (OuterVolumeSpecName: "config") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849609 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849686 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849844 4797 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21067728-d3cf-4ff2-94c9-87600f7324ab-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849917 4797 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21067728-d3cf-4ff2-94c9-87600f7324ab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.849995 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.850077 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.850147 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.850223 
4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d865\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-kube-api-access-5d865\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.850261 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d661f302-5234-4d18-9aa8-0eddd26153fe-kube-api-access-9fwsx" (OuterVolumeSpecName: "kube-api-access-9fwsx") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "kube-api-access-9fwsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.854378 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "21067728-d3cf-4ff2-94c9-87600f7324ab" (UID: "21067728-d3cf-4ff2-94c9-87600f7324ab"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.869104 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.869422 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.902025 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-789bb6874b-qp58p"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.922319 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-789bb6874b-qp58p"] Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.944061 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.949318 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d661f302-5234-4d18-9aa8-0eddd26153fe" (UID: "d661f302-5234-4d18-9aa8-0eddd26153fe"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.950687 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-erlang-cookie\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.950797 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.950849 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-tls\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.950911 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-confd\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.950942 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-server-conf\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.950968 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49qwl\" (UniqueName: 
\"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-kube-api-access-49qwl\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951033 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-plugins-conf\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951066 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acdec9fc-360a-46e4-89ea-3fde84f417c0-erlang-cookie-secret\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951104 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acdec9fc-360a-46e4-89ea-3fde84f417c0-pod-info\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951154 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951181 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-plugins\") pod \"acdec9fc-360a-46e4-89ea-3fde84f417c0\" (UID: \"acdec9fc-360a-46e4-89ea-3fde84f417c0\") " Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 
13:29:51.951233 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951567 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21067728-d3cf-4ff2-94c9-87600f7324ab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951588 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951601 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951614 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwsx\" (UniqueName: \"kubernetes.io/projected/d661f302-5234-4d18-9aa8-0eddd26153fe-kube-api-access-9fwsx\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951627 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951638 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951650 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d661f302-5234-4d18-9aa8-0eddd26153fe-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951662 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d661f302-5234-4d18-9aa8-0eddd26153fe-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.951834 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.952113 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.953560 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.954085 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.954734 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdec9fc-360a-46e4-89ea-3fde84f417c0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.955890 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/acdec9fc-360a-46e4-89ea-3fde84f417c0-pod-info" (OuterVolumeSpecName: "pod-info") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.956246 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-kube-api-access-49qwl" (OuterVolumeSpecName: "kube-api-access-49qwl") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "kube-api-access-49qwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.975111 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data" (OuterVolumeSpecName: "config-data") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:51 crc kubenswrapper[4797]: I1013 13:29:51.986305 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-server-conf" (OuterVolumeSpecName: "server-conf") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.035799 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "acdec9fc-360a-46e4-89ea-3fde84f417c0" (UID: "acdec9fc-360a-46e4-89ea-3fde84f417c0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052663 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052699 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052709 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052718 4797 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-server-conf\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052728 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49qwl\" (UniqueName: \"kubernetes.io/projected/acdec9fc-360a-46e4-89ea-3fde84f417c0-kube-api-access-49qwl\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052736 4797 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052745 4797 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acdec9fc-360a-46e4-89ea-3fde84f417c0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052753 4797 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acdec9fc-360a-46e4-89ea-3fde84f417c0-pod-info\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052761 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdec9fc-360a-46e4-89ea-3fde84f417c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.052768 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acdec9fc-360a-46e4-89ea-3fde84f417c0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.067679 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.154100 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.585206 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"acdec9fc-360a-46e4-89ea-3fde84f417c0","Type":"ContainerDied","Data":"7ca684bd06b79cd361e9ae37a6ebd09af0d174370e8b4c826f82ccdf5efb69f3"} Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.585258 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.585267 4797 scope.go:117] "RemoveContainer" containerID="84b296cdae027daaba2dce536affe2df5bb8565c9eaea497ef3762320f6ea09d" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.597734 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d661f302-5234-4d18-9aa8-0eddd26153fe/ovn-northd/0.log" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.597884 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d661f302-5234-4d18-9aa8-0eddd26153fe","Type":"ContainerDied","Data":"c43d7fd65f1a3539ba1c9fb279f0e0bca70eec6d5392156f528778ea76540ad0"} Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.597893 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.603520 4797 generic.go:334] "Generic (PLEG): container finished" podID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerID="3e128c54e1e732791a3db021ae19e7a7d4bd2ecbe1361a7e7ba33071c47c83c3" exitCode=0 Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.603582 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerDied","Data":"3e128c54e1e732791a3db021ae19e7a7d4bd2ecbe1361a7e7ba33071c47c83c3"} Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.607833 4797 generic.go:334] "Generic (PLEG): container finished" podID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerID="af88e3a88bd4391cf42637ef4fd96c25933e8e2d449d7aa7ee46670261ef0157" exitCode=0 Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.607895 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" 
event={"ID":"b6d097e4-da24-434c-9b9f-2e84279240a6","Type":"ContainerDied","Data":"af88e3a88bd4391cf42637ef4fd96c25933e8e2d449d7aa7ee46670261ef0157"} Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.613251 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.613827 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"21067728-d3cf-4ff2-94c9-87600f7324ab","Type":"ContainerDied","Data":"c827d0ec843c5e1ccb7ad4f30660ae1ed3fff3b98977fa073ac90305847b44f4"} Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.616216 4797 scope.go:117] "RemoveContainer" containerID="c87767a147d4c8704a237c416cfdc4858485ccdb790924187fa2558d28ea1605" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.624997 4797 generic.go:334] "Generic (PLEG): container finished" podID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerID="7f52052a3590935705a3b41e4aa7b4198b91ac9582dd55b94814a428e54fb81b" exitCode=0 Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.625070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc74bd777-hvb5p" event={"ID":"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0","Type":"ContainerDied","Data":"7f52052a3590935705a3b41e4aa7b4198b91ac9582dd55b94814a428e54fb81b"} Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.694916 4797 scope.go:117] "RemoveContainer" containerID="1a733e45e064aeec3878d4c5dd8fe67bcb4e25caaf9192479e03c78ce5fbd2b5" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.770043 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.772153 4797 scope.go:117] "RemoveContainer" containerID="c4243df011234c180288fc1c95c327de116944eeb9f76e3b80b6ff0317063169" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.772364 4797 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.779560 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.780734 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.784924 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.791387 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.804070 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.805885 4797 scope.go:117] "RemoveContainer" containerID="b14ae8c7513d0ced2b738189c2e015ff0743dc1feeeda16b9f6380925730cb4f" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.812936 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.816633 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.832447 4797 scope.go:117] "RemoveContainer" containerID="dc60509f6fdf80ab3c7a93d4d24dd520df75f4c9ccb98b44fd3e2e5450ca0b88" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867451 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data-custom\") pod \"b6d097e4-da24-434c-9b9f-2e84279240a6\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 
13:29:52.867509 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-ceilometer-tls-certs\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867533 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-log-httpd\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867551 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-combined-ca-bundle\") pod \"b6d097e4-da24-434c-9b9f-2e84279240a6\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867575 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-combined-ca-bundle\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867605 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6d097e4-da24-434c-9b9f-2e84279240a6-logs\") pod \"b6d097e4-da24-434c-9b9f-2e84279240a6\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867620 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbzsl\" (UniqueName: \"kubernetes.io/projected/b6d097e4-da24-434c-9b9f-2e84279240a6-kube-api-access-kbzsl\") pod 
\"b6d097e4-da24-434c-9b9f-2e84279240a6\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom\") pod \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867659 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-run-httpd\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867690 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-sg-core-conf-yaml\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867743 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data\") pod \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867777 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-combined-ca-bundle\") pod \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867849 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-fqxd8\" (UniqueName: \"kubernetes.io/projected/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-kube-api-access-fqxd8\") pod \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-config-data\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867900 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data\") pod \"b6d097e4-da24-434c-9b9f-2e84279240a6\" (UID: \"b6d097e4-da24-434c-9b9f-2e84279240a6\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867926 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fqs5\" (UniqueName: \"kubernetes.io/projected/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-kube-api-access-4fqs5\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867951 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-scripts\") pod \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\" (UID: \"7297d3f3-134d-4fcf-85f2-8b414e2fb27d\") " Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.867978 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-logs\") pod \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\" (UID: \"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0\") " Oct 13 13:29:52 crc kubenswrapper[4797]: 
I1013 13:29:52.868869 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.868954 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-logs" (OuterVolumeSpecName: "logs") pod "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.869858 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d097e4-da24-434c-9b9f-2e84279240a6-logs" (OuterVolumeSpecName: "logs") pod "b6d097e4-da24-434c-9b9f-2e84279240a6" (UID: "b6d097e4-da24-434c-9b9f-2e84279240a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.871956 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.877349 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d097e4-da24-434c-9b9f-2e84279240a6-kube-api-access-kbzsl" (OuterVolumeSpecName: "kube-api-access-kbzsl") pod "b6d097e4-da24-434c-9b9f-2e84279240a6" (UID: "b6d097e4-da24-434c-9b9f-2e84279240a6"). 
InnerVolumeSpecName "kube-api-access-kbzsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.878051 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-scripts" (OuterVolumeSpecName: "scripts") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.878666 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.879892 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-kube-api-access-fqxd8" (OuterVolumeSpecName: "kube-api-access-fqxd8") pod "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0"). InnerVolumeSpecName "kube-api-access-fqxd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.880714 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6d097e4-da24-434c-9b9f-2e84279240a6" (UID: "b6d097e4-da24-434c-9b9f-2e84279240a6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.905398 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-kube-api-access-4fqs5" (OuterVolumeSpecName: "kube-api-access-4fqs5") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "kube-api-access-4fqs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.908072 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.908189 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6d097e4-da24-434c-9b9f-2e84279240a6" (UID: "b6d097e4-da24-434c-9b9f-2e84279240a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.922333 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.924317 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.935648 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data" (OuterVolumeSpecName: "config-data") pod "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" (UID: "99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.939637 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data" (OuterVolumeSpecName: "config-data") pod "b6d097e4-da24-434c-9b9f-2e84279240a6" (UID: "b6d097e4-da24-434c-9b9f-2e84279240a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.956142 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969904 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969939 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6d097e4-da24-434c-9b9f-2e84279240a6-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969951 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbzsl\" (UniqueName: \"kubernetes.io/projected/b6d097e4-da24-434c-9b9f-2e84279240a6-kube-api-access-kbzsl\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969962 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969970 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969981 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.969990 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970005 4797 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970020 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqxd8\" (UniqueName: \"kubernetes.io/projected/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-kube-api-access-fqxd8\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970031 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970044 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fqs5\" (UniqueName: \"kubernetes.io/projected/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-kube-api-access-4fqs5\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970054 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970061 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0-logs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970069 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970076 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970083 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.970090 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d097e4-da24-434c-9b9f-2e84279240a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:52 crc kubenswrapper[4797]: I1013 13:29:52.976628 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-config-data" (OuterVolumeSpecName: "config-data") pod "7297d3f3-134d-4fcf-85f2-8b414e2fb27d" (UID: "7297d3f3-134d-4fcf-85f2-8b414e2fb27d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.070792 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7297d3f3-134d-4fcf-85f2-8b414e2fb27d-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.116567 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.171477 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhbhl\" (UniqueName: \"kubernetes.io/projected/60394f60-af79-4a07-8f3f-75fb61c31894-kube-api-access-fhbhl\") pod \"60394f60-af79-4a07-8f3f-75fb61c31894\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.171561 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-config-data\") pod \"60394f60-af79-4a07-8f3f-75fb61c31894\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.171682 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-combined-ca-bundle\") pod \"60394f60-af79-4a07-8f3f-75fb61c31894\" (UID: \"60394f60-af79-4a07-8f3f-75fb61c31894\") " Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.177965 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60394f60-af79-4a07-8f3f-75fb61c31894-kube-api-access-fhbhl" (OuterVolumeSpecName: "kube-api-access-fhbhl") pod "60394f60-af79-4a07-8f3f-75fb61c31894" (UID: "60394f60-af79-4a07-8f3f-75fb61c31894"). InnerVolumeSpecName "kube-api-access-fhbhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.192430 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60394f60-af79-4a07-8f3f-75fb61c31894" (UID: "60394f60-af79-4a07-8f3f-75fb61c31894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.192972 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-config-data" (OuterVolumeSpecName: "config-data") pod "60394f60-af79-4a07-8f3f-75fb61c31894" (UID: "60394f60-af79-4a07-8f3f-75fb61c31894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.247579 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" path="/var/lib/kubelet/pods/21067728-d3cf-4ff2-94c9-87600f7324ab/volumes" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.248777 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" path="/var/lib/kubelet/pods/300309d9-4375-4cce-8fb1-0833d2cfdcde/volumes" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.250580 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" path="/var/lib/kubelet/pods/6bf02d7e-6b92-4d2a-838f-20cdd6a7046e/volumes" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.251821 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" path="/var/lib/kubelet/pods/acdec9fc-360a-46e4-89ea-3fde84f417c0/volumes" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.252918 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" path="/var/lib/kubelet/pods/d661f302-5234-4d18-9aa8-0eddd26153fe/volumes" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.254707 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" path="/var/lib/kubelet/pods/e2a3161e-b16d-436d-b547-87e182ef5e27/volumes" Oct 13 13:29:53 crc 
kubenswrapper[4797]: I1013 13:29:53.273637 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.273703 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhbhl\" (UniqueName: \"kubernetes.io/projected/60394f60-af79-4a07-8f3f-75fb61c31894-kube-api-access-fhbhl\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.273721 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60394f60-af79-4a07-8f3f-75fb61c31894-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.291002 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.292188 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.292326 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.292417 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.292474 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.294134 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.295741 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.295790 4797 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.636775 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7297d3f3-134d-4fcf-85f2-8b414e2fb27d","Type":"ContainerDied","Data":"2cab4f7f0a37ffa4f3249cf612ce70a140a5ccf56e26a4bb1043dead686d84eb"} Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.636891 4797 scope.go:117] "RemoveContainer" containerID="26a477d9a49a32396a4c8959aafd79c16854ffb6419bb4f4a5f1de667a6d6c19" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.637002 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.643756 4797 generic.go:334] "Generic (PLEG): container finished" podID="60394f60-af79-4a07-8f3f-75fb61c31894" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" exitCode=0 Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.643894 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60394f60-af79-4a07-8f3f-75fb61c31894","Type":"ContainerDied","Data":"45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53"} Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.644214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60394f60-af79-4a07-8f3f-75fb61c31894","Type":"ContainerDied","Data":"0ac92e3ed27f5d295a88366f51b035ae718352911b12dc0dd6a7cbef5c66267c"} Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.644495 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.646974 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.647717 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76fd44f586-tp25f" event={"ID":"b6d097e4-da24-434c-9b9f-2e84279240a6","Type":"ContainerDied","Data":"bdc3d0adc4f7f76c25a6ad3da7a1f05cdc4c7f810641550ea31eed320d4f2224"} Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.652063 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc74bd777-hvb5p" event={"ID":"99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0","Type":"ContainerDied","Data":"128213963d53bab847cdbca6cd310c1dd55ec97434c0b7ad2a2cce6bcb78a94b"} Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.652173 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-cc74bd777-hvb5p" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.679857 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-76fd44f586-tp25f"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.680040 4797 scope.go:117] "RemoveContainer" containerID="0eb68056284c660ad9015956fa330c09c86d1bd6d0b2e7c87865c45cb9f710fa" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.691111 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-76fd44f586-tp25f"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.702194 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.712912 4797 scope.go:117] "RemoveContainer" containerID="3e128c54e1e732791a3db021ae19e7a7d4bd2ecbe1361a7e7ba33071c47c83c3" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.717251 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.724153 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-cc74bd777-hvb5p"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.730144 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-cc74bd777-hvb5p"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.734992 4797 scope.go:117] "RemoveContainer" containerID="8b3dbc305e498b09f402b2934e96ebeacc6ba59d0b362bafb37d2783cec22f3c" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.735238 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.762964 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.783976 4797 scope.go:117] 
"RemoveContainer" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.804779 4797 scope.go:117] "RemoveContainer" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" Oct 13 13:29:53 crc kubenswrapper[4797]: E1013 13:29:53.805292 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53\": container with ID starting with 45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53 not found: ID does not exist" containerID="45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.805328 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53"} err="failed to get container status \"45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53\": rpc error: code = NotFound desc = could not find container \"45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53\": container with ID starting with 45f5cee6335c0ba2bc083ace9fff9eb625941edef21cfaa370cfe539173e7b53 not found: ID does not exist" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.805348 4797 scope.go:117] "RemoveContainer" containerID="af88e3a88bd4391cf42637ef4fd96c25933e8e2d449d7aa7ee46670261ef0157" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.829305 4797 scope.go:117] "RemoveContainer" containerID="91313e817dc0492cf7a60f479444a1c27f27b873cabe704400743e07a7510a7d" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.848618 4797 scope.go:117] "RemoveContainer" containerID="7f52052a3590935705a3b41e4aa7b4198b91ac9582dd55b94814a428e54fb81b" Oct 13 13:29:53 crc kubenswrapper[4797]: I1013 13:29:53.869225 4797 scope.go:117] "RemoveContainer" 
containerID="b77c094eeee1c437962694391b190f9591ed0b8e1210663b157e4b3d4c2a0207" Oct 13 13:29:55 crc kubenswrapper[4797]: I1013 13:29:55.244309 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" path="/var/lib/kubelet/pods/60394f60-af79-4a07-8f3f-75fb61c31894/volumes" Oct 13 13:29:55 crc kubenswrapper[4797]: I1013 13:29:55.245082 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" path="/var/lib/kubelet/pods/7297d3f3-134d-4fcf-85f2-8b414e2fb27d/volumes" Oct 13 13:29:55 crc kubenswrapper[4797]: I1013 13:29:55.245823 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" path="/var/lib/kubelet/pods/99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0/volumes" Oct 13 13:29:55 crc kubenswrapper[4797]: I1013 13:29:55.246838 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" path="/var/lib/kubelet/pods/b6d097e4-da24-434c-9b9f-2e84279240a6/volumes" Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.292020 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.292694 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:58 crc kubenswrapper[4797]: 
E1013 13:29:58.292781 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.294209 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.294208 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.294391 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.296255 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:29:58 crc kubenswrapper[4797]: E1013 13:29:58.296313 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.152763 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc"] Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153370 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="galera" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153382 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="galera" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153395 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-notification-agent" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153401 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-notification-agent" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153409 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerName="mysql-bootstrap" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153415 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerName="mysql-bootstrap" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153424 
4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153431 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153445 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153451 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153463 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153469 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153481 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153488 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153495 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153501 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 
13:30:00.153508 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" containerName="nova-cell1-conductor-conductor" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153513 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" containerName="nova-cell1-conductor-conductor" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153521 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="proxy-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153527 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="proxy-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153537 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="init" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153542 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="init" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153550 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="mysql-bootstrap" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153681 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="mysql-bootstrap" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153693 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="ovsdbserver-nb" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153700 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="ovsdbserver-nb" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153707 
4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="ovsdbserver-sb" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153713 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="ovsdbserver-sb" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153723 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-metadata" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153729 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-metadata" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153738 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" containerName="keystone-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153743 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" containerName="keystone-api" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153750 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153755 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153766 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153771 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153779 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" containerName="kube-state-metrics" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153784 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" containerName="kube-state-metrics" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153793 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="dnsmasq-dns" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153798 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="dnsmasq-dns" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153840 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" containerName="memcached" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153846 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" containerName="memcached" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153856 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerName="setup-container" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153862 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerName="setup-container" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153872 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" containerName="nova-scheduler-scheduler" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153878 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" containerName="nova-scheduler-scheduler" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153886 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="312a660f-ea89-49ac-8857-16dae844353f" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153892 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="312a660f-ea89-49ac-8857-16dae844353f" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153901 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153906 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153913 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153919 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-api" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153929 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7503305-66a2-4504-b208-6795946d8701" containerName="nova-cell0-conductor-conductor" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153935 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7503305-66a2-4504-b208-6795946d8701" containerName="nova-cell0-conductor-conductor" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153946 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-server" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153952 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-server" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153961 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-central-agent" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153966 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-central-agent" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153973 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153978 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.153984 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="ovn-northd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.153990 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="ovn-northd" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154000 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerName="rabbitmq" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154007 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerName="rabbitmq" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154016 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerName="rabbitmq" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154022 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerName="rabbitmq" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154032 4797 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154037 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154043 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc3b8cc-c74c-402d-8284-7d578bfa7c02" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154048 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc3b8cc-c74c-402d-8284-7d578bfa7c02" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154058 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154064 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154074 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154079 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154085 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerName="galera" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154091 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerName="galera" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154101 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154107 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154117 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154123 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154133 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a327836f-196f-4d29-8792-33085b552aa9" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154139 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a327836f-196f-4d29-8792-33085b552aa9" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154149 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerName="setup-container" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154155 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerName="setup-container" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154166 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154171 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154180 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154185 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154194 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5421ab8e-2db8-4909-b67b-e0491f7b80e7" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154200 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5421ab8e-2db8-4909-b67b-e0491f7b80e7" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154206 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154211 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154218 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154224 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154233 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154238 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154253 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154260 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker-log" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154275 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154282 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154293 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdce3f4-f1b1-4323-b237-eb28b936ebc7" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154300 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdce3f4-f1b1-4323-b237-eb28b936ebc7" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154314 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154320 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-api" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154329 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb03045-bfdc-4d0b-af8f-4e3c4717e792" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154335 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3cb03045-bfdc-4d0b-af8f-4e3c4717e792" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154343 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="sg-core" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154348 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="sg-core" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.154358 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3111854e-cfce-493d-a094-63479ed35583" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154365 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3111854e-cfce-493d-a094-63479ed35583" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154511 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154520 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3111854e-cfce-493d-a094-63479ed35583" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154526 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="sg-core" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154536 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdce3f4-f1b1-4323-b237-eb28b936ebc7" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154544 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-server" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154552 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fe29d3f9-7e6b-4fc0-8268-a5c3bd0e7f1e" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154559 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc3b8cc-c74c-402d-8284-7d578bfa7c02" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154571 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="ceilometer-notification-agent" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154578 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="312a660f-ea89-49ac-8857-16dae844353f" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154589 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5421ab8e-2db8-4909-b67b-e0491f7b80e7" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154595 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a327836f-196f-4d29-8792-33085b552aa9" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154605 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5bf67e-bfcd-4a21-b67f-9d2893da31ab" containerName="memcached" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154612 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fed4821-c587-408b-b6b0-bcc080170628" containerName="proxy-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154623 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12892ce-6d68-4f79-b1dd-e874dffba145" containerName="galera" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154631 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf02d7e-6b92-4d2a-838f-20cdd6a7046e" containerName="keystone-api" Oct 13 13:30:00 crc 
kubenswrapper[4797]: I1013 13:30:00.154640 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154650 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154658 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d561c30-1e2f-4a3d-b042-8191c88e4bb6" containerName="barbican-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154666 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="85dd770b-9d5c-4cc9-adaa-87963d5bb160" containerName="ovn-controller" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154674 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154683 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10b6d3a-3a6f-4ab7-9ded-4885e28bdbc6" containerName="nova-scheduler-scheduler" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154691 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154699 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154708 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154714 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" 
containerName="ceilometer-central-agent" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154721 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2f17a4-493b-4b76-9dea-ef70ed8e1525" containerName="cinder-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154729 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7503305-66a2-4504-b208-6795946d8701" containerName="nova-cell0-conductor-conductor" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154739 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154749 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d661f302-5234-4d18-9aa8-0eddd26153fe" containerName="ovn-northd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154757 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="openstack-network-exporter" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154763 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154772 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e82c55-e59e-4d97-800c-66a4f9555047" containerName="placement-api" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154780 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154787 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154795 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cefeac7c-e65d-4c12-8f7e-e56bf30c04fa" containerName="nova-metadata-metadata" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154847 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddeeadaa-8237-4fad-8fd7-8e9c0580a1ed" containerName="kube-state-metrics" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154857 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ff3b6b-b49c-4214-9cf1-ea9eb85f28c0" containerName="barbican-worker" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154863 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="60394f60-af79-4a07-8f3f-75fb61c31894" containerName="nova-cell1-conductor-conductor" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154870 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fec705-3fa8-4f2b-aa9d-1afec561d884" containerName="dnsmasq-dns" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154879 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2be119d-ecfb-4f81-b947-46797c215b8e" containerName="glance-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154888 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a3161e-b16d-436d-b547-87e182ef5e27" containerName="nova-api-log" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154896 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdec9fc-360a-46e4-89ea-3fde84f417c0" containerName="rabbitmq" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154905 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="047404d9-b0ab-44e2-a31d-94d8fe429698" containerName="ovsdbserver-sb" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154912 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d097e4-da24-434c-9b9f-2e84279240a6" containerName="barbican-keystone-listener-log" Oct 13 13:30:00 crc 
kubenswrapper[4797]: I1013 13:30:00.154919 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7297d3f3-134d-4fcf-85f2-8b414e2fb27d" containerName="proxy-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154927 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="300309d9-4375-4cce-8fb1-0833d2cfdcde" containerName="galera" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154937 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="21067728-d3cf-4ff2-94c9-87600f7324ab" containerName="rabbitmq" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154944 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="244e58a1-ed2c-4ff6-8885-ebd066e8adab" containerName="ovsdbserver-nb" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154954 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="416aefad-3318-4406-b1c1-fdba0ce21437" containerName="glance-httpd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.154962 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb03045-bfdc-4d0b-af8f-4e3c4717e792" containerName="mariadb-account-delete" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.155617 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.159467 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.159924 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.164358 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc"] Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.278489 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad219041-8188-4fbc-9fc6-dac8a4b904c3-secret-volume\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.278604 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad219041-8188-4fbc-9fc6-dac8a4b904c3-config-volume\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.278846 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5ztx\" (UniqueName: \"kubernetes.io/projected/ad219041-8188-4fbc-9fc6-dac8a4b904c3-kube-api-access-j5ztx\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.380020 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ztx\" (UniqueName: \"kubernetes.io/projected/ad219041-8188-4fbc-9fc6-dac8a4b904c3-kube-api-access-j5ztx\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.380370 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad219041-8188-4fbc-9fc6-dac8a4b904c3-secret-volume\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.380417 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad219041-8188-4fbc-9fc6-dac8a4b904c3-config-volume\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.381414 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad219041-8188-4fbc-9fc6-dac8a4b904c3-config-volume\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.394116 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ad219041-8188-4fbc-9fc6-dac8a4b904c3-secret-volume\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.397545 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5ztx\" (UniqueName: \"kubernetes.io/projected/ad219041-8188-4fbc-9fc6-dac8a4b904c3-kube-api-access-j5ztx\") pod \"collect-profiles-29339370-mslvc\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.530194 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.565629 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.684897 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tglwn\" (UniqueName: \"kubernetes.io/projected/11a6d485-2926-4d07-9b32-e81ab882de4c-kube-api-access-tglwn\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.685266 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-httpd-config\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.685393 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-ovndb-tls-certs\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.685427 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-combined-ca-bundle\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.685449 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-internal-tls-certs\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.685898 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-config\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.686020 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-public-tls-certs\") pod \"11a6d485-2926-4d07-9b32-e81ab882de4c\" (UID: \"11a6d485-2926-4d07-9b32-e81ab882de4c\") " Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.688727 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.689434 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a6d485-2926-4d07-9b32-e81ab882de4c-kube-api-access-tglwn" (OuterVolumeSpecName: "kube-api-access-tglwn") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "kube-api-access-tglwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.725742 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.729746 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.731350 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-config" (OuterVolumeSpecName: "config") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.732646 4797 generic.go:334] "Generic (PLEG): container finished" podID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerID="765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8" exitCode=0 Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.732680 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57cbbb4d89-r9rvd" event={"ID":"11a6d485-2926-4d07-9b32-e81ab882de4c","Type":"ContainerDied","Data":"765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8"} Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.732704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57cbbb4d89-r9rvd" event={"ID":"11a6d485-2926-4d07-9b32-e81ab882de4c","Type":"ContainerDied","Data":"330a258a35155936c58f00a41dda095b1ddd6de665b1352dcea80c60c24af94b"} Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.732720 4797 scope.go:117] "RemoveContainer" containerID="366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117" Oct 13 13:30:00 crc kubenswrapper[4797]: 
I1013 13:30:00.732856 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57cbbb4d89-r9rvd" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.738175 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.742954 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "11a6d485-2926-4d07-9b32-e81ab882de4c" (UID: "11a6d485-2926-4d07-9b32-e81ab882de4c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.753120 4797 scope.go:117] "RemoveContainer" containerID="765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.768918 4797 scope.go:117] "RemoveContainer" containerID="366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.769304 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117\": container with ID starting with 366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117 not found: ID does not exist" containerID="366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.769346 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117"} err="failed to get container status \"366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117\": rpc error: code = NotFound desc = could not find container \"366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117\": container with ID starting with 366a02b08ccc7deee54d9245cc187eef22dec7cc6454b3fe21bc9ed73aea1117 not found: ID does not exist" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.769372 4797 scope.go:117] "RemoveContainer" containerID="765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8" Oct 13 13:30:00 crc kubenswrapper[4797]: E1013 13:30:00.769662 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8\": container with ID starting with 765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8 not found: ID does not exist" containerID="765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.769705 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8"} err="failed to get container status \"765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8\": rpc error: code = NotFound desc = could not find container \"765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8\": container with ID starting with 765f926c0e272fb4be3628cfe2a3f4a06d19c3788e310b7584f497aaf8cfd1b8 not found: ID does not exist" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.787964 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 
13:30:00.788002 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.788017 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tglwn\" (UniqueName: \"kubernetes.io/projected/11a6d485-2926-4d07-9b32-e81ab882de4c-kube-api-access-tglwn\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.788031 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.788043 4797 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.788053 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.788063 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a6d485-2926-4d07-9b32-e81ab882de4c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:00 crc kubenswrapper[4797]: I1013 13:30:00.937599 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc"] Oct 13 13:30:01 crc kubenswrapper[4797]: I1013 13:30:01.067336 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57cbbb4d89-r9rvd"] Oct 13 13:30:01 crc kubenswrapper[4797]: I1013 13:30:01.073567 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57cbbb4d89-r9rvd"] Oct 13 13:30:01 crc kubenswrapper[4797]: I1013 13:30:01.245303 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" path="/var/lib/kubelet/pods/11a6d485-2926-4d07-9b32-e81ab882de4c/volumes" Oct 13 13:30:01 crc kubenswrapper[4797]: I1013 13:30:01.748443 4797 generic.go:334] "Generic (PLEG): container finished" podID="ad219041-8188-4fbc-9fc6-dac8a4b904c3" containerID="a6a5f98a978fccc927f2b842bcde0364f07c12b99c741765c5ab8cb736a25f02" exitCode=0 Oct 13 13:30:01 crc kubenswrapper[4797]: I1013 13:30:01.748506 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" event={"ID":"ad219041-8188-4fbc-9fc6-dac8a4b904c3","Type":"ContainerDied","Data":"a6a5f98a978fccc927f2b842bcde0364f07c12b99c741765c5ab8cb736a25f02"} Oct 13 13:30:01 crc kubenswrapper[4797]: I1013 13:30:01.748866 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" event={"ID":"ad219041-8188-4fbc-9fc6-dac8a4b904c3","Type":"ContainerStarted","Data":"79b98d76fab6d52885d224f85ec90118c0a93861fa8d20ea601884eb198aab82"} Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.059745 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.122583 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5ztx\" (UniqueName: \"kubernetes.io/projected/ad219041-8188-4fbc-9fc6-dac8a4b904c3-kube-api-access-j5ztx\") pod \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.122678 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad219041-8188-4fbc-9fc6-dac8a4b904c3-config-volume\") pod \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.122977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad219041-8188-4fbc-9fc6-dac8a4b904c3-secret-volume\") pod \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\" (UID: \"ad219041-8188-4fbc-9fc6-dac8a4b904c3\") " Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.123775 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad219041-8188-4fbc-9fc6-dac8a4b904c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad219041-8188-4fbc-9fc6-dac8a4b904c3" (UID: "ad219041-8188-4fbc-9fc6-dac8a4b904c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.127582 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad219041-8188-4fbc-9fc6-dac8a4b904c3-kube-api-access-j5ztx" (OuterVolumeSpecName: "kube-api-access-j5ztx") pod "ad219041-8188-4fbc-9fc6-dac8a4b904c3" (UID: "ad219041-8188-4fbc-9fc6-dac8a4b904c3"). 
InnerVolumeSpecName "kube-api-access-j5ztx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.127627 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad219041-8188-4fbc-9fc6-dac8a4b904c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad219041-8188-4fbc-9fc6-dac8a4b904c3" (UID: "ad219041-8188-4fbc-9fc6-dac8a4b904c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.224719 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad219041-8188-4fbc-9fc6-dac8a4b904c3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.224763 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5ztx\" (UniqueName: \"kubernetes.io/projected/ad219041-8188-4fbc-9fc6-dac8a4b904c3-kube-api-access-j5ztx\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.224779 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad219041-8188-4fbc-9fc6-dac8a4b904c3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.291534 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.291880 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.292233 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.292275 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.293547 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.295303 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.296616 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:03 crc kubenswrapper[4797]: E1013 13:30:03.296662 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.783697 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" event={"ID":"ad219041-8188-4fbc-9fc6-dac8a4b904c3","Type":"ContainerDied","Data":"79b98d76fab6d52885d224f85ec90118c0a93861fa8d20ea601884eb198aab82"} Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.784084 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b98d76fab6d52885d224f85ec90118c0a93861fa8d20ea601884eb198aab82" Oct 13 13:30:03 crc kubenswrapper[4797]: I1013 13:30:03.783752 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc" Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.291404 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.292589 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.292995 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.293036 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:30:08 crc 
kubenswrapper[4797]: E1013 13:30:08.295222 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.296705 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.298038 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:08 crc kubenswrapper[4797]: E1013 13:30:08.298083 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:08 crc kubenswrapper[4797]: I1013 13:30:08.332999 4797 scope.go:117] "RemoveContainer" containerID="d71ea01203c3ae01ea8325c2bb868f94f1383842ef8fa152c98b3afecb3c64ce" Oct 13 13:30:08 crc kubenswrapper[4797]: I1013 13:30:08.355975 4797 scope.go:117] "RemoveContainer" containerID="b1e44153ed376aa2a5b20e44939ad60b37e029b63d505942d897d9c4bae17008" Oct 13 13:30:08 crc kubenswrapper[4797]: I1013 13:30:08.377862 4797 
scope.go:117] "RemoveContainer" containerID="c055cb85c274325ba4dfd7ffd98811f07884655ca3f5193c1fd0181b74581d57" Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.291327 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.292467 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.292775 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.292820 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:30:13 
crc kubenswrapper[4797]: E1013 13:30:13.293252 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.295507 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.297243 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 13 13:30:13 crc kubenswrapper[4797]: E1013 13:30:13.297286 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2mpq9" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.892314 4797 generic.go:334] "Generic (PLEG): container finished" podID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerID="cf81e79b12c28741c928f832632179cdb954c2366581d8d7306d9be83dbc6228" exitCode=137 Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.892419 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"cc4b497b-efb0-4294-8af9-c16bb2835e36","Type":"ContainerDied","Data":"cf81e79b12c28741c928f832632179cdb954c2366581d8d7306d9be83dbc6228"} Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.900864 4797 generic.go:334] "Generic (PLEG): container finished" podID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerID="df5690bc37dc98f263a51145643771f0253fa3da622e37dab0ac9bd217f8b17d" exitCode=137 Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.900945 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"df5690bc37dc98f263a51145643771f0253fa3da622e37dab0ac9bd217f8b17d"} Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.903581 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mpq9_3ac6531d-4d7d-4cf0-b943-984f885b4a6d/ovs-vswitchd/0.log" Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.904369 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" exitCode=137 Oct 13 13:30:14 crc kubenswrapper[4797]: I1013 13:30:14.904394 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerDied","Data":"530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368"} Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.028090 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.029600 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115299 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data\") pod \"cc4b497b-efb0-4294-8af9-c16bb2835e36\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115355 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sckhd\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-kube-api-access-sckhd\") pod \"f853cd93-92bc-46d6-8bd4-82373edcac6c\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115422 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data-custom\") pod \"cc4b497b-efb0-4294-8af9-c16bb2835e36\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115443 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-scripts\") pod \"cc4b497b-efb0-4294-8af9-c16bb2835e36\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f853cd93-92bc-46d6-8bd4-82373edcac6c\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115517 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") pod \"f853cd93-92bc-46d6-8bd4-82373edcac6c\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115545 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-combined-ca-bundle\") pod \"cc4b497b-efb0-4294-8af9-c16bb2835e36\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115583 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-cache\") pod \"f853cd93-92bc-46d6-8bd4-82373edcac6c\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115603 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc4b497b-efb0-4294-8af9-c16bb2835e36-etc-machine-id\") pod \"cc4b497b-efb0-4294-8af9-c16bb2835e36\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-lock\") pod \"f853cd93-92bc-46d6-8bd4-82373edcac6c\" (UID: \"f853cd93-92bc-46d6-8bd4-82373edcac6c\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.115674 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvjt5\" (UniqueName: \"kubernetes.io/projected/cc4b497b-efb0-4294-8af9-c16bb2835e36-kube-api-access-gvjt5\") pod \"cc4b497b-efb0-4294-8af9-c16bb2835e36\" (UID: \"cc4b497b-efb0-4294-8af9-c16bb2835e36\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.121919 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-cache" (OuterVolumeSpecName: "cache") pod "f853cd93-92bc-46d6-8bd4-82373edcac6c" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.122542 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc4b497b-efb0-4294-8af9-c16bb2835e36-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cc4b497b-efb0-4294-8af9-c16bb2835e36" (UID: "cc4b497b-efb0-4294-8af9-c16bb2835e36"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.123062 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-lock" (OuterVolumeSpecName: "lock") pod "f853cd93-92bc-46d6-8bd4-82373edcac6c" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.125211 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-kube-api-access-sckhd" (OuterVolumeSpecName: "kube-api-access-sckhd") pod "f853cd93-92bc-46d6-8bd4-82373edcac6c" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c"). InnerVolumeSpecName "kube-api-access-sckhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.125477 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4b497b-efb0-4294-8af9-c16bb2835e36-kube-api-access-gvjt5" (OuterVolumeSpecName: "kube-api-access-gvjt5") pod "cc4b497b-efb0-4294-8af9-c16bb2835e36" (UID: "cc4b497b-efb0-4294-8af9-c16bb2835e36"). InnerVolumeSpecName "kube-api-access-gvjt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.126584 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cc4b497b-efb0-4294-8af9-c16bb2835e36" (UID: "cc4b497b-efb0-4294-8af9-c16bb2835e36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.126651 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "f853cd93-92bc-46d6-8bd4-82373edcac6c" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.128014 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-scripts" (OuterVolumeSpecName: "scripts") pod "cc4b497b-efb0-4294-8af9-c16bb2835e36" (UID: "cc4b497b-efb0-4294-8af9-c16bb2835e36"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.128408 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f853cd93-92bc-46d6-8bd4-82373edcac6c" (UID: "f853cd93-92bc-46d6-8bd4-82373edcac6c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.165600 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc4b497b-efb0-4294-8af9-c16bb2835e36" (UID: "cc4b497b-efb0-4294-8af9-c16bb2835e36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.201891 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data" (OuterVolumeSpecName: "config-data") pod "cc4b497b-efb0-4294-8af9-c16bb2835e36" (UID: "cc4b497b-efb0-4294-8af9-c16bb2835e36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.216905 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.216926 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.216953 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.216988 4797 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.216997 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.217005 4797 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-cache\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.217013 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc4b497b-efb0-4294-8af9-c16bb2835e36-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.217022 4797 reconciler_common.go:293] "Volume detached for 
volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f853cd93-92bc-46d6-8bd4-82373edcac6c-lock\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.217030 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvjt5\" (UniqueName: \"kubernetes.io/projected/cc4b497b-efb0-4294-8af9-c16bb2835e36-kube-api-access-gvjt5\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.217039 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4b497b-efb0-4294-8af9-c16bb2835e36-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.217046 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sckhd\" (UniqueName: \"kubernetes.io/projected/f853cd93-92bc-46d6-8bd4-82373edcac6c-kube-api-access-sckhd\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.230672 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.251965 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mpq9_3ac6531d-4d7d-4cf0-b943-984f885b4a6d/ovs-vswitchd/0.log" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.252852 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317467 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-log\") pod \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317562 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-etc-ovs\") pod \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317580 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-log" (OuterVolumeSpecName: "var-log") pod "3ac6531d-4d7d-4cf0-b943-984f885b4a6d" (UID: "3ac6531d-4d7d-4cf0-b943-984f885b4a6d"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317621 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-scripts\") pod \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317665 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-run\") pod \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317685 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-lib\") pod \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317681 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "3ac6531d-4d7d-4cf0-b943-984f885b4a6d" (UID: "3ac6531d-4d7d-4cf0-b943-984f885b4a6d"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317705 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-run" (OuterVolumeSpecName: "var-run") pod "3ac6531d-4d7d-4cf0-b943-984f885b4a6d" (UID: "3ac6531d-4d7d-4cf0-b943-984f885b4a6d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsw2g\" (UniqueName: \"kubernetes.io/projected/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-kube-api-access-wsw2g\") pod \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\" (UID: \"3ac6531d-4d7d-4cf0-b943-984f885b4a6d\") " Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.317796 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-lib" (OuterVolumeSpecName: "var-lib") pod "3ac6531d-4d7d-4cf0-b943-984f885b4a6d" (UID: "3ac6531d-4d7d-4cf0-b943-984f885b4a6d"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.318094 4797 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.318113 4797 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.318123 4797 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-lib\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.318133 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.318142 4797 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-var-log\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.319572 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-scripts" (OuterVolumeSpecName: "scripts") pod "3ac6531d-4d7d-4cf0-b943-984f885b4a6d" (UID: "3ac6531d-4d7d-4cf0-b943-984f885b4a6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.322216 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-kube-api-access-wsw2g" (OuterVolumeSpecName: "kube-api-access-wsw2g") pod "3ac6531d-4d7d-4cf0-b943-984f885b4a6d" (UID: "3ac6531d-4d7d-4cf0-b943-984f885b4a6d"). InnerVolumeSpecName "kube-api-access-wsw2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.420571 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.420632 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsw2g\" (UniqueName: \"kubernetes.io/projected/3ac6531d-4d7d-4cf0-b943-984f885b4a6d-kube-api-access-wsw2g\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.921471 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cc4b497b-efb0-4294-8af9-c16bb2835e36","Type":"ContainerDied","Data":"f5926e75952e74a6189644d96cefddb84f5ef84b17df4a64d1dd0322928cf2a9"} Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.921526 4797 scope.go:117] "RemoveContainer" 
containerID="2cf43831975875d820d476539ef4c0943fa120e06b5317c6aec843e932949edd" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.921539 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.936394 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f853cd93-92bc-46d6-8bd4-82373edcac6c","Type":"ContainerDied","Data":"2be8916fa85b9eacc2af4fecd094384a0088200f5107be5125dbae69fcbeced8"} Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.936539 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.948585 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mpq9_3ac6531d-4d7d-4cf0-b943-984f885b4a6d/ovs-vswitchd/0.log" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.952233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mpq9" event={"ID":"3ac6531d-4d7d-4cf0-b943-984f885b4a6d","Type":"ContainerDied","Data":"670bd2bb956f74860f67d01b8baeecf23d1c7656d71c3a7834b27f415b0e8483"} Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.952366 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2mpq9" Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.956274 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.964028 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 13:30:15 crc kubenswrapper[4797]: I1013 13:30:15.966913 4797 scope.go:117] "RemoveContainer" containerID="cf81e79b12c28741c928f832632179cdb954c2366581d8d7306d9be83dbc6228" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.003231 4797 scope.go:117] "RemoveContainer" containerID="df5690bc37dc98f263a51145643771f0253fa3da622e37dab0ac9bd217f8b17d" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.011882 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.026766 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.038331 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-2mpq9"] Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.041037 4797 scope.go:117] "RemoveContainer" containerID="7c966ea8b8d377fe98198c55458614115fcf39c47512b6e6a01c20502d489780" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.043960 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-2mpq9"] Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.057751 4797 scope.go:117] "RemoveContainer" containerID="0771ac639aaa1011650222d8f818316f8651f5729cf3d02726d6f291d0ca0403" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.075258 4797 scope.go:117] "RemoveContainer" containerID="88fe19d7acc00d4d082fc5672f55a59aa59797313060f49a5e87d490f16e6bd6" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.093280 4797 scope.go:117] "RemoveContainer" 
containerID="95610d8f86f925fa29c39c4b649aca588a5bfca6031effb0eddf4c0ba26766bc" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.113405 4797 scope.go:117] "RemoveContainer" containerID="2c54fef5660226ff362a71ac3e917e001db5e9a347a08deab58b5660776811fc" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.129243 4797 scope.go:117] "RemoveContainer" containerID="f98ce95dd4a03f4ac462cac39a646ae12a813083511d7a3c50bdddd1624ba0aa" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.147834 4797 scope.go:117] "RemoveContainer" containerID="7fa1e18bf4f048760b982cf61e4fdc5706474901c9f59b877086707fd0c2bad5" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.164955 4797 scope.go:117] "RemoveContainer" containerID="a05561b229ddf692864eb50f4e3134b3ae7ce64dcd992877ca1a87198bca77ca" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.181598 4797 scope.go:117] "RemoveContainer" containerID="826bb87dfbab6e082d77678c431c5f60fe685f53d7f5d122666545d7e4d18b90" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.201074 4797 scope.go:117] "RemoveContainer" containerID="e34ee38f199fac8191d033ce294a7a0959606e96937bb15c3973792f94df9fb9" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.229020 4797 scope.go:117] "RemoveContainer" containerID="e14c6e172f563574bf54ee4a48fdf4ed5d505dd4e2e640e88c9ca54cdfa00ed3" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.247546 4797 scope.go:117] "RemoveContainer" containerID="fd4ad2d986ae8532b8471784274e34f2e4066c64a28827eb0bdc5088d9c323f3" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.275270 4797 scope.go:117] "RemoveContainer" containerID="ad8ecb3fe40030b9df7dc7b9b77499e35e2b251475733a87367b79e69bc6d068" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.291779 4797 scope.go:117] "RemoveContainer" containerID="c4ce4b63b22c727785eb5abec889eca7d94fdcd5d0b3cda4520df492e1acde65" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.313483 4797 scope.go:117] "RemoveContainer" 
containerID="530a23262b718d00b295a64b20c635224a04f307f764b6900f060c9b7a722368" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.336694 4797 scope.go:117] "RemoveContainer" containerID="a06c1a43c4f53948ecf8f2a50db2d6224db0be23e64e47a447e2e46bcc407c2e" Oct 13 13:30:16 crc kubenswrapper[4797]: I1013 13:30:16.353732 4797 scope.go:117] "RemoveContainer" containerID="89c20ea2719d92d48088b58d7e39f9d562964dd99319780d3e2dff0b28232c4e" Oct 13 13:30:17 crc kubenswrapper[4797]: I1013 13:30:17.248410 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" path="/var/lib/kubelet/pods/3ac6531d-4d7d-4cf0-b943-984f885b4a6d/volumes" Oct 13 13:30:17 crc kubenswrapper[4797]: I1013 13:30:17.249546 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" path="/var/lib/kubelet/pods/cc4b497b-efb0-4294-8af9-c16bb2835e36/volumes" Oct 13 13:30:17 crc kubenswrapper[4797]: I1013 13:30:17.250475 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" path="/var/lib/kubelet/pods/f853cd93-92bc-46d6-8bd4-82373edcac6c/volumes" Oct 13 13:30:19 crc kubenswrapper[4797]: I1013 13:30:19.902756 4797 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod312a660f-ea89-49ac-8857-16dae844353f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod312a660f-ea89-49ac-8857-16dae844353f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod312a660f_ea89_49ac_8857_16dae844353f.slice" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.400106 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sffhh"] Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401132 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" 
containerName="cinder-scheduler" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401152 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerName="cinder-scheduler" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401170 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-expirer" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401178 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-expirer" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401190 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401199 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401215 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-updater" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401223 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-updater" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401245 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401253 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401268 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" 
containerName="probe" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401276 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerName="probe" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401294 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401304 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401315 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="rsync" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401323 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="rsync" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401336 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-updater" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401344 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-updater" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401353 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server-init" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401361 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server-init" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401372 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-server" Oct 13 13:30:43 crc 
kubenswrapper[4797]: I1013 13:30:43.401380 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-server" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401393 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401401 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401411 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401419 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401433 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401441 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401456 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401464 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-server" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401480 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-httpd" Oct 13 13:30:43 crc 
kubenswrapper[4797]: I1013 13:30:43.401488 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-httpd" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401501 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad219041-8188-4fbc-9fc6-dac8a4b904c3" containerName="collect-profiles" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401509 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad219041-8188-4fbc-9fc6-dac8a4b904c3" containerName="collect-profiles" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401525 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401533 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-server" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401547 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-reaper" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401555 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-reaper" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401565 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="swift-recon-cron" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401574 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="swift-recon-cron" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401591 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-auditor" Oct 13 13:30:43 crc 
kubenswrapper[4797]: I1013 13:30:43.401602 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401616 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401627 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: E1013 13:30:43.401644 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-api" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401654 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-api" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401886 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401904 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401917 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-expirer" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401930 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401945 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-replicator" Oct 13 13:30:43 crc 
kubenswrapper[4797]: I1013 13:30:43.401957 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovsdb-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401967 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401978 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-httpd" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.401993 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-replicator" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402003 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="rsync" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402014 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-updater" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402023 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad219041-8188-4fbc-9fc6-dac8a4b904c3" containerName="collect-profiles" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402038 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a6d485-2926-4d07-9b32-e81ab882de4c" containerName="neutron-api" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402047 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerName="cinder-scheduler" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402058 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" 
containerName="account-reaper" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402072 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="container-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402083 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac6531d-4d7d-4cf0-b943-984f885b4a6d" containerName="ovs-vswitchd" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402092 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4b497b-efb0-4294-8af9-c16bb2835e36" containerName="probe" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402103 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-server" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402116 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="object-updater" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402130 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="swift-recon-cron" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.402139 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853cd93-92bc-46d6-8bd4-82373edcac6c" containerName="account-auditor" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.403346 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.418169 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sffhh"] Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.584135 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-utilities\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.586447 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-catalog-content\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.586651 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdj2t\" (UniqueName: \"kubernetes.io/projected/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-kube-api-access-fdj2t\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.688240 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-utilities\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.688719 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-catalog-content\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.688843 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdj2t\" (UniqueName: \"kubernetes.io/projected/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-kube-api-access-fdj2t\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.688841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-utilities\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.689157 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-catalog-content\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.715626 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdj2t\" (UniqueName: \"kubernetes.io/projected/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-kube-api-access-fdj2t\") pod \"community-operators-sffhh\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:43 crc kubenswrapper[4797]: I1013 13:30:43.761672 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:44 crc kubenswrapper[4797]: I1013 13:30:44.306170 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sffhh"] Oct 13 13:30:45 crc kubenswrapper[4797]: I1013 13:30:45.262235 4797 generic.go:334] "Generic (PLEG): container finished" podID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerID="c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6" exitCode=0 Oct 13 13:30:45 crc kubenswrapper[4797]: I1013 13:30:45.262296 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerDied","Data":"c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6"} Oct 13 13:30:45 crc kubenswrapper[4797]: I1013 13:30:45.263336 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerStarted","Data":"3670230045e3dc7f3e525e24228ba72c0dd17da35e45e453bed147a25cb66307"} Oct 13 13:30:46 crc kubenswrapper[4797]: I1013 13:30:46.276514 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerStarted","Data":"97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e"} Oct 13 13:30:48 crc kubenswrapper[4797]: I1013 13:30:48.296976 4797 generic.go:334] "Generic (PLEG): container finished" podID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerID="97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e" exitCode=0 Oct 13 13:30:48 crc kubenswrapper[4797]: I1013 13:30:48.297927 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" 
event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerDied","Data":"97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e"} Oct 13 13:30:49 crc kubenswrapper[4797]: I1013 13:30:49.309655 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerStarted","Data":"8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e"} Oct 13 13:30:49 crc kubenswrapper[4797]: I1013 13:30:49.334941 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sffhh" podStartSLOduration=2.847105903 podStartE2EDuration="6.334914196s" podCreationTimestamp="2025-10-13 13:30:43 +0000 UTC" firstStartedPulling="2025-10-13 13:30:45.264163576 +0000 UTC m=+1422.797713832" lastFinishedPulling="2025-10-13 13:30:48.751971819 +0000 UTC m=+1426.285522125" observedRunningTime="2025-10-13 13:30:49.328008947 +0000 UTC m=+1426.861559223" watchObservedRunningTime="2025-10-13 13:30:49.334914196 +0000 UTC m=+1426.868464462" Oct 13 13:30:53 crc kubenswrapper[4797]: I1013 13:30:53.762908 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:53 crc kubenswrapper[4797]: I1013 13:30:53.763300 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:53 crc kubenswrapper[4797]: I1013 13:30:53.841725 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:54 crc kubenswrapper[4797]: I1013 13:30:54.433602 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:54 crc kubenswrapper[4797]: I1013 13:30:54.491385 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sffhh"] Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.372946 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sffhh" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="registry-server" containerID="cri-o://8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e" gracePeriod=2 Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.819380 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.910648 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-catalog-content\") pod \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.910999 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdj2t\" (UniqueName: \"kubernetes.io/projected/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-kube-api-access-fdj2t\") pod \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.911131 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-utilities\") pod \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\" (UID: \"e1a0d630-1daf-4b87-b297-3ea670a0e6d4\") " Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.912033 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-utilities" (OuterVolumeSpecName: "utilities") pod "e1a0d630-1daf-4b87-b297-3ea670a0e6d4" (UID: 
"e1a0d630-1daf-4b87-b297-3ea670a0e6d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:30:56 crc kubenswrapper[4797]: I1013 13:30:56.917171 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-kube-api-access-fdj2t" (OuterVolumeSpecName: "kube-api-access-fdj2t") pod "e1a0d630-1daf-4b87-b297-3ea670a0e6d4" (UID: "e1a0d630-1daf-4b87-b297-3ea670a0e6d4"). InnerVolumeSpecName "kube-api-access-fdj2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.013049 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdj2t\" (UniqueName: \"kubernetes.io/projected/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-kube-api-access-fdj2t\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.013094 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.085067 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1a0d630-1daf-4b87-b297-3ea670a0e6d4" (UID: "e1a0d630-1daf-4b87-b297-3ea670a0e6d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.114615 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1a0d630-1daf-4b87-b297-3ea670a0e6d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.389106 4797 generic.go:334] "Generic (PLEG): container finished" podID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerID="8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e" exitCode=0 Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.389168 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerDied","Data":"8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e"} Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.389219 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sffhh" event={"ID":"e1a0d630-1daf-4b87-b297-3ea670a0e6d4","Type":"ContainerDied","Data":"3670230045e3dc7f3e525e24228ba72c0dd17da35e45e453bed147a25cb66307"} Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.389233 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sffhh" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.389249 4797 scope.go:117] "RemoveContainer" containerID="8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.429922 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sffhh"] Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.440743 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sffhh"] Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.447458 4797 scope.go:117] "RemoveContainer" containerID="97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.475072 4797 scope.go:117] "RemoveContainer" containerID="c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.502949 4797 scope.go:117] "RemoveContainer" containerID="8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e" Oct 13 13:30:57 crc kubenswrapper[4797]: E1013 13:30:57.503659 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e\": container with ID starting with 8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e not found: ID does not exist" containerID="8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.503728 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e"} err="failed to get container status \"8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e\": rpc error: code = NotFound desc = could not find 
container \"8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e\": container with ID starting with 8a5eb328ce9b42908cdb8f9dfd5b1323daa39273a3f619e32ce523c93600463e not found: ID does not exist" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.503772 4797 scope.go:117] "RemoveContainer" containerID="97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e" Oct 13 13:30:57 crc kubenswrapper[4797]: E1013 13:30:57.504430 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e\": container with ID starting with 97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e not found: ID does not exist" containerID="97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.504614 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e"} err="failed to get container status \"97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e\": rpc error: code = NotFound desc = could not find container \"97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e\": container with ID starting with 97f2657c74ec4eaa27ce8eff0a24c3564c0c898cca22139ea967f22a3141b31e not found: ID does not exist" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.504699 4797 scope.go:117] "RemoveContainer" containerID="c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6" Oct 13 13:30:57 crc kubenswrapper[4797]: E1013 13:30:57.505270 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6\": container with ID starting with c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6 not found: ID does 
not exist" containerID="c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6" Oct 13 13:30:57 crc kubenswrapper[4797]: I1013 13:30:57.505312 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6"} err="failed to get container status \"c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6\": rpc error: code = NotFound desc = could not find container \"c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6\": container with ID starting with c2082162e22386af6d921905e2a8b1c1c9861309572ad6fed2722b0900b57ec6 not found: ID does not exist" Oct 13 13:30:59 crc kubenswrapper[4797]: I1013 13:30:59.252475 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" path="/var/lib/kubelet/pods/e1a0d630-1daf-4b87-b297-3ea670a0e6d4/volumes" Oct 13 13:31:08 crc kubenswrapper[4797]: I1013 13:31:08.870739 4797 scope.go:117] "RemoveContainer" containerID="cede006473b611931f79f58356b66eee508c9f0bdede650304bb54fc7058b8f5" Oct 13 13:31:08 crc kubenswrapper[4797]: I1013 13:31:08.914430 4797 scope.go:117] "RemoveContainer" containerID="569a075615ac9ed0ec14fd0173ec0ac70aa932fe695c353fe31c61bed8fd10e7" Oct 13 13:31:08 crc kubenswrapper[4797]: I1013 13:31:08.954991 4797 scope.go:117] "RemoveContainer" containerID="a0ba3520e5651533522d5be4eedd2ce11b85f4a41e04d516a04f5658408ca62b" Oct 13 13:31:08 crc kubenswrapper[4797]: I1013 13:31:08.987129 4797 scope.go:117] "RemoveContainer" containerID="ab217f3ba3c3ed71ed169dc093a277ef9cafc9d81e5055e8a1c6d6832b21be97" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.011569 4797 scope.go:117] "RemoveContainer" containerID="2d5f6e18830224ba06878209c725fc3b37c7a8b004f6560b8c4a0fc12883cc60" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.096358 4797 scope.go:117] "RemoveContainer" 
containerID="6558c30f6c2510b59df66709040b96084db877c0138ba5ac939214197756b028" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.118555 4797 scope.go:117] "RemoveContainer" containerID="4360772a9d1e2f22d64f0217e01b149d341125b39f692cf2ceaffd1c6f90417b" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.148152 4797 scope.go:117] "RemoveContainer" containerID="80cd488390657ee26fda2dfd41b682aa23d52a2cb5737ee54f682386c5c48619" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.172867 4797 scope.go:117] "RemoveContainer" containerID="2818ad7c05d964e9362c8f835a73e1b7d87549c1a9b7705e996f9b7e2e49537c" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.194118 4797 scope.go:117] "RemoveContainer" containerID="8e8a08b1fb90c5143dfc8554374d133c2d0f35ccf8dc5db129321ce754983254" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.213719 4797 scope.go:117] "RemoveContainer" containerID="784be3c5f941d88bcbbee59370e2e0936ed8962a19a1f5e76ada2d6bea8e66d5" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.236169 4797 scope.go:117] "RemoveContainer" containerID="f289e553a5332ee32521ef8171b127e08ae0d885341334c6dd1917ebd2145de5" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.262879 4797 scope.go:117] "RemoveContainer" containerID="4c4d14744c4c686a72e50ef1567e0aedf9062d7e4844b7fb8c13d286b0b7f12b" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.300461 4797 scope.go:117] "RemoveContainer" containerID="5c5cc2d87050a3de571b1f1b9c6bf168c449bd32a7883ebd8e094f5734bd54f2" Oct 13 13:31:09 crc kubenswrapper[4797]: I1013 13:31:09.323728 4797 scope.go:117] "RemoveContainer" containerID="de93f004ff46c218cae69e7fffa4767fbadceb4b17be2daf089d6bb40a62869c" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.560295 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gb798"] Oct 13 13:31:15 crc kubenswrapper[4797]: E1013 13:31:15.561214 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="extract-content" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.561231 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="extract-content" Oct 13 13:31:15 crc kubenswrapper[4797]: E1013 13:31:15.561250 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="extract-utilities" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.561258 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="extract-utilities" Oct 13 13:31:15 crc kubenswrapper[4797]: E1013 13:31:15.561279 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="registry-server" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.561287 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="registry-server" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.561464 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a0d630-1daf-4b87-b297-3ea670a0e6d4" containerName="registry-server" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.562871 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.590313 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb798"] Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.665076 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-catalog-content\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.665278 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58nj\" (UniqueName: \"kubernetes.io/projected/d0123aec-090e-4af3-baeb-6129a0c6a176-kube-api-access-z58nj\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.665350 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-utilities\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.767343 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58nj\" (UniqueName: \"kubernetes.io/projected/d0123aec-090e-4af3-baeb-6129a0c6a176-kube-api-access-z58nj\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.767406 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-utilities\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.767494 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-catalog-content\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.768019 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-catalog-content\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.768034 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-utilities\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.787215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58nj\" (UniqueName: \"kubernetes.io/projected/d0123aec-090e-4af3-baeb-6129a0c6a176-kube-api-access-z58nj\") pod \"redhat-marketplace-gb798\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:15 crc kubenswrapper[4797]: I1013 13:31:15.899473 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:16 crc kubenswrapper[4797]: I1013 13:31:16.125901 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb798"] Oct 13 13:31:16 crc kubenswrapper[4797]: I1013 13:31:16.595612 4797 generic.go:334] "Generic (PLEG): container finished" podID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerID="e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b" exitCode=0 Oct 13 13:31:16 crc kubenswrapper[4797]: I1013 13:31:16.595679 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerDied","Data":"e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b"} Oct 13 13:31:16 crc kubenswrapper[4797]: I1013 13:31:16.596157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerStarted","Data":"d3afa2dc42b568d0c0f534afc30e92d5679245a8cd2a3e8899a5421b6702a7e1"} Oct 13 13:31:17 crc kubenswrapper[4797]: I1013 13:31:17.608052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerStarted","Data":"599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366"} Oct 13 13:31:18 crc kubenswrapper[4797]: I1013 13:31:18.621783 4797 generic.go:334] "Generic (PLEG): container finished" podID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerID="599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366" exitCode=0 Oct 13 13:31:18 crc kubenswrapper[4797]: I1013 13:31:18.621891 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" 
event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerDied","Data":"599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366"} Oct 13 13:31:19 crc kubenswrapper[4797]: I1013 13:31:19.635081 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerStarted","Data":"294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3"} Oct 13 13:31:25 crc kubenswrapper[4797]: I1013 13:31:25.900517 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:25 crc kubenswrapper[4797]: I1013 13:31:25.900942 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:25 crc kubenswrapper[4797]: I1013 13:31:25.977452 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:26 crc kubenswrapper[4797]: I1013 13:31:26.011751 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gb798" podStartSLOduration=8.562513962 podStartE2EDuration="11.011719653s" podCreationTimestamp="2025-10-13 13:31:15 +0000 UTC" firstStartedPulling="2025-10-13 13:31:16.597970657 +0000 UTC m=+1454.131520953" lastFinishedPulling="2025-10-13 13:31:19.047176378 +0000 UTC m=+1456.580726644" observedRunningTime="2025-10-13 13:31:19.661729068 +0000 UTC m=+1457.195279344" watchObservedRunningTime="2025-10-13 13:31:26.011719653 +0000 UTC m=+1463.545269959" Oct 13 13:31:26 crc kubenswrapper[4797]: I1013 13:31:26.740293 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:26 crc kubenswrapper[4797]: I1013 13:31:26.781381 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-gb798"] Oct 13 13:31:28 crc kubenswrapper[4797]: I1013 13:31:28.712955 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gb798" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="registry-server" containerID="cri-o://294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3" gracePeriod=2 Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.178287 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.268957 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58nj\" (UniqueName: \"kubernetes.io/projected/d0123aec-090e-4af3-baeb-6129a0c6a176-kube-api-access-z58nj\") pod \"d0123aec-090e-4af3-baeb-6129a0c6a176\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.269008 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-catalog-content\") pod \"d0123aec-090e-4af3-baeb-6129a0c6a176\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.269077 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-utilities\") pod \"d0123aec-090e-4af3-baeb-6129a0c6a176\" (UID: \"d0123aec-090e-4af3-baeb-6129a0c6a176\") " Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.270626 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-utilities" (OuterVolumeSpecName: "utilities") pod "d0123aec-090e-4af3-baeb-6129a0c6a176" (UID: 
"d0123aec-090e-4af3-baeb-6129a0c6a176"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.279040 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0123aec-090e-4af3-baeb-6129a0c6a176-kube-api-access-z58nj" (OuterVolumeSpecName: "kube-api-access-z58nj") pod "d0123aec-090e-4af3-baeb-6129a0c6a176" (UID: "d0123aec-090e-4af3-baeb-6129a0c6a176"). InnerVolumeSpecName "kube-api-access-z58nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.289344 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0123aec-090e-4af3-baeb-6129a0c6a176" (UID: "d0123aec-090e-4af3-baeb-6129a0c6a176"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.371131 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.371176 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z58nj\" (UniqueName: \"kubernetes.io/projected/d0123aec-090e-4af3-baeb-6129a0c6a176-kube-api-access-z58nj\") on node \"crc\" DevicePath \"\"" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.371188 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0123aec-090e-4af3-baeb-6129a0c6a176-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.727849 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerID="294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3" exitCode=0 Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.727898 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerDied","Data":"294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3"} Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.727930 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb798" event={"ID":"d0123aec-090e-4af3-baeb-6129a0c6a176","Type":"ContainerDied","Data":"d3afa2dc42b568d0c0f534afc30e92d5679245a8cd2a3e8899a5421b6702a7e1"} Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.727954 4797 scope.go:117] "RemoveContainer" containerID="294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.727980 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb798" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.752068 4797 scope.go:117] "RemoveContainer" containerID="599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.779195 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb798"] Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.785747 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb798"] Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.795087 4797 scope.go:117] "RemoveContainer" containerID="e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.824270 4797 scope.go:117] "RemoveContainer" containerID="294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3" Oct 13 13:31:29 crc kubenswrapper[4797]: E1013 13:31:29.824939 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3\": container with ID starting with 294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3 not found: ID does not exist" containerID="294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.824971 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3"} err="failed to get container status \"294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3\": rpc error: code = NotFound desc = could not find container \"294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3\": container with ID starting with 294b81a3e33ec389a78cdda713a0035881dbb0828b17818ab99e7e2a1ea706b3 not found: 
ID does not exist" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.824991 4797 scope.go:117] "RemoveContainer" containerID="599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366" Oct 13 13:31:29 crc kubenswrapper[4797]: E1013 13:31:29.825514 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366\": container with ID starting with 599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366 not found: ID does not exist" containerID="599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.825607 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366"} err="failed to get container status \"599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366\": rpc error: code = NotFound desc = could not find container \"599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366\": container with ID starting with 599147f930a5fbb5fcc3b2cb3f3af52468e7253eddc70a09908a16871a059366 not found: ID does not exist" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.825683 4797 scope.go:117] "RemoveContainer" containerID="e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b" Oct 13 13:31:29 crc kubenswrapper[4797]: E1013 13:31:29.826254 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b\": container with ID starting with e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b not found: ID does not exist" containerID="e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b" Oct 13 13:31:29 crc kubenswrapper[4797]: I1013 13:31:29.826323 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b"} err="failed to get container status \"e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b\": rpc error: code = NotFound desc = could not find container \"e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b\": container with ID starting with e3e08391d7ed3a5d563bc720b95547e401cc6092124320ffee358d4f50626e5b not found: ID does not exist" Oct 13 13:31:31 crc kubenswrapper[4797]: I1013 13:31:31.249273 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" path="/var/lib/kubelet/pods/d0123aec-090e-4af3-baeb-6129a0c6a176/volumes" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.649986 4797 scope.go:117] "RemoveContainer" containerID="8a82a9e40c540c4652fc02f27eeaca8b45824717be1359cb164d63d458e5e12f" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.694861 4797 scope.go:117] "RemoveContainer" containerID="e4cf8b303fc9930630ea8cf2fafc4b1ecdee57504baf50f2bfe1bf0f63ecfaf1" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.715781 4797 scope.go:117] "RemoveContainer" containerID="500b5a1ee7aa9cf5b1f05189445b52d0c05810f3b993e86be05f0b07214c543c" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.741906 4797 scope.go:117] "RemoveContainer" containerID="4251a02173a50fd16205c9b173f202379b93c111710f2865655e603d39f3ed2c" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.781487 4797 scope.go:117] "RemoveContainer" containerID="b59989a84aa6cae92a4cd6b910046455f53509fe73ce82a007f4aab8aa821634" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.801510 4797 scope.go:117] "RemoveContainer" containerID="eafea942da1ddb6d4abaaa93876373ee484606f7001d932ec2587dd831ade1f5" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.839196 4797 scope.go:117] "RemoveContainer" containerID="07625101ccd79368453ce261cd92f3c80074068cd66650ebfdbe4888531d4cd2" 
Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.857069 4797 scope.go:117] "RemoveContainer" containerID="12e079d1fea424b348c31cef1525d77ed92d39ac5a4b98cb78fbf186197a402c" Oct 13 13:32:09 crc kubenswrapper[4797]: I1013 13:32:09.896732 4797 scope.go:117] "RemoveContainer" containerID="b5c8e1cc5e2837e1df6c74841ca13a868c1d79cca109abf5eddbf0d8bb543195" Oct 13 13:32:18 crc kubenswrapper[4797]: I1013 13:32:18.119743 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:32:18 crc kubenswrapper[4797]: I1013 13:32:18.121678 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:32:48 crc kubenswrapper[4797]: I1013 13:32:48.120832 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:32:48 crc kubenswrapper[4797]: I1013 13:32:48.121420 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.062686 4797 scope.go:117] "RemoveContainer" 
containerID="36e7e2aeec321eaebaa86fac8932d4e7bb3dce1420adea82887e645b8e0c07f2" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.094476 4797 scope.go:117] "RemoveContainer" containerID="9d64c2be9688b9fb4e9f73b6943972899943509da3f03d108888f03aae8e7ad2" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.129942 4797 scope.go:117] "RemoveContainer" containerID="2a0a2151ae65744b3d130619b0d9e3af037af9e2ad82efcda00139bb22be61e9" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.145980 4797 scope.go:117] "RemoveContainer" containerID="c597d4730edec7f44d1fce4d0ce4d87234325f19fc8caa11012c424af77676b8" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.170240 4797 scope.go:117] "RemoveContainer" containerID="ac261ec2207e3bf512b4dde7e4a1e1e0dc07b2d3fbe679680f2d4d0c24ca5ddd" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.200065 4797 scope.go:117] "RemoveContainer" containerID="5dbaaac2816f861e605b3530adc848ceccb2aa44765d018c23ce49dddf5fbec3" Oct 13 13:33:10 crc kubenswrapper[4797]: I1013 13:33:10.226616 4797 scope.go:117] "RemoveContainer" containerID="08725d77ede82c87183202476a70e1d4f3cdcf4541813fed902f4f45081356cd" Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.120923 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.121566 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.121659 4797 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.122530 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.122621 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" gracePeriod=600 Oct 13 13:33:18 crc kubenswrapper[4797]: E1013 13:33:18.251876 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.766486 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" exitCode=0 Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.766551 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb"} Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.766601 4797 scope.go:117] "RemoveContainer" containerID="7a2f1a197e052d816aea722ded8ddb41413d4d55e91b26d0412cfadb04dd4ef6" Oct 13 13:33:18 crc kubenswrapper[4797]: I1013 13:33:18.767495 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:33:18 crc kubenswrapper[4797]: E1013 13:33:18.768215 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:33:32 crc kubenswrapper[4797]: I1013 13:33:32.236723 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:33:32 crc kubenswrapper[4797]: E1013 13:33:32.237760 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:33:43 crc kubenswrapper[4797]: I1013 13:33:43.251722 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:33:43 crc kubenswrapper[4797]: E1013 13:33:43.252953 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:33:55 crc kubenswrapper[4797]: I1013 13:33:55.236732 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:33:55 crc kubenswrapper[4797]: E1013 13:33:55.237723 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:34:10 crc kubenswrapper[4797]: I1013 13:34:10.236317 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:34:10 crc kubenswrapper[4797]: E1013 13:34:10.238499 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:34:10 crc kubenswrapper[4797]: I1013 13:34:10.332578 4797 scope.go:117] "RemoveContainer" containerID="08c5f7a46cf66da7ac4bd3d50fc97b7d99b7ffff6dfe1bf44438942e8be3a569" Oct 13 13:34:10 crc kubenswrapper[4797]: I1013 13:34:10.390999 4797 scope.go:117] "RemoveContainer" 
containerID="f7e069b9ab89c7959910da337a2d82dec852dac12fc5e241175f7c593d851a00" Oct 13 13:34:10 crc kubenswrapper[4797]: I1013 13:34:10.412800 4797 scope.go:117] "RemoveContainer" containerID="5fbdc263c596a7c3afdf13d3673d89669a69914a1c25734b4adeb4c6c4f4e7be" Oct 13 13:34:24 crc kubenswrapper[4797]: I1013 13:34:24.237488 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:34:24 crc kubenswrapper[4797]: E1013 13:34:24.238680 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:34:35 crc kubenswrapper[4797]: I1013 13:34:35.236414 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:34:35 crc kubenswrapper[4797]: E1013 13:34:35.237572 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:34:46 crc kubenswrapper[4797]: I1013 13:34:46.236530 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:34:46 crc kubenswrapper[4797]: E1013 13:34:46.237562 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:35:00 crc kubenswrapper[4797]: I1013 13:35:00.236925 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:35:00 crc kubenswrapper[4797]: E1013 13:35:00.238120 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:35:10 crc kubenswrapper[4797]: I1013 13:35:10.497973 4797 scope.go:117] "RemoveContainer" containerID="ddbd9fbba7afb83602cfa1c4d7828d401160383366b2cc9f83c9b5cab203b153" Oct 13 13:35:11 crc kubenswrapper[4797]: I1013 13:35:11.236885 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:35:11 crc kubenswrapper[4797]: E1013 13:35:11.237630 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:35:24 crc kubenswrapper[4797]: I1013 13:35:24.237010 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:35:24 crc 
kubenswrapper[4797]: E1013 13:35:24.237605 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:35:35 crc kubenswrapper[4797]: I1013 13:35:35.236411 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:35:35 crc kubenswrapper[4797]: E1013 13:35:35.237227 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:35:50 crc kubenswrapper[4797]: I1013 13:35:50.237401 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:35:50 crc kubenswrapper[4797]: E1013 13:35:50.238531 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:36:01 crc kubenswrapper[4797]: I1013 13:36:01.236754 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 
13 13:36:01 crc kubenswrapper[4797]: E1013 13:36:01.237497 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:36:10 crc kubenswrapper[4797]: I1013 13:36:10.563077 4797 scope.go:117] "RemoveContainer" containerID="0046c194d5cb2a17a8e468e7c97653174cda795dc3a35ef52e73abaa120bddf3" Oct 13 13:36:10 crc kubenswrapper[4797]: I1013 13:36:10.585459 4797 scope.go:117] "RemoveContainer" containerID="d2554b76de82af7c27df20bec7682a8cbc461613a3a2b1a5e9ff6acf46612daf" Oct 13 13:36:14 crc kubenswrapper[4797]: I1013 13:36:14.236873 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:36:14 crc kubenswrapper[4797]: E1013 13:36:14.237677 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:36:28 crc kubenswrapper[4797]: I1013 13:36:28.236309 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:36:28 crc kubenswrapper[4797]: E1013 13:36:28.237113 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:36:43 crc kubenswrapper[4797]: I1013 13:36:43.243478 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:36:43 crc kubenswrapper[4797]: E1013 13:36:43.244507 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:36:56 crc kubenswrapper[4797]: I1013 13:36:56.236195 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:36:56 crc kubenswrapper[4797]: E1013 13:36:56.238722 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:37:07 crc kubenswrapper[4797]: I1013 13:37:07.236959 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:37:07 crc kubenswrapper[4797]: E1013 13:37:07.238081 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.410858 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqq9m"] Oct 13 13:37:13 crc kubenswrapper[4797]: E1013 13:37:13.411765 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="registry-server" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.411789 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="registry-server" Oct 13 13:37:13 crc kubenswrapper[4797]: E1013 13:37:13.411847 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="extract-content" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.411861 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="extract-content" Oct 13 13:37:13 crc kubenswrapper[4797]: E1013 13:37:13.411885 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="extract-utilities" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.411897 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="extract-utilities" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.412134 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0123aec-090e-4af3-baeb-6129a0c6a176" containerName="registry-server" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.413861 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.424998 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqq9m"] Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.550691 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zmx\" (UniqueName: \"kubernetes.io/projected/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-kube-api-access-f5zmx\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.550930 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-catalog-content\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.551525 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-utilities\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.653473 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-utilities\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.653553 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f5zmx\" (UniqueName: \"kubernetes.io/projected/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-kube-api-access-f5zmx\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.653584 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-catalog-content\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.654287 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-utilities\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.654305 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-catalog-content\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.679885 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5zmx\" (UniqueName: \"kubernetes.io/projected/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-kube-api-access-f5zmx\") pod \"certified-operators-vqq9m\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:13 crc kubenswrapper[4797]: I1013 13:37:13.733768 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:14 crc kubenswrapper[4797]: I1013 13:37:14.199211 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqq9m"] Oct 13 13:37:14 crc kubenswrapper[4797]: I1013 13:37:14.847309 4797 generic.go:334] "Generic (PLEG): container finished" podID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerID="c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2" exitCode=0 Oct 13 13:37:14 crc kubenswrapper[4797]: I1013 13:37:14.847389 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq9m" event={"ID":"0c005f98-b2ac-4f0c-a46c-5cc7693640f2","Type":"ContainerDied","Data":"c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2"} Oct 13 13:37:14 crc kubenswrapper[4797]: I1013 13:37:14.847776 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq9m" event={"ID":"0c005f98-b2ac-4f0c-a46c-5cc7693640f2","Type":"ContainerStarted","Data":"d635df92d74ee64ccd2bc6ff48f1537cd86da8b605599252588d60e427fb5f0a"} Oct 13 13:37:14 crc kubenswrapper[4797]: I1013 13:37:14.849301 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 13:37:16 crc kubenswrapper[4797]: I1013 13:37:16.862258 4797 generic.go:334] "Generic (PLEG): container finished" podID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerID="5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93" exitCode=0 Oct 13 13:37:16 crc kubenswrapper[4797]: I1013 13:37:16.862578 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq9m" event={"ID":"0c005f98-b2ac-4f0c-a46c-5cc7693640f2","Type":"ContainerDied","Data":"5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93"} Oct 13 13:37:17 crc kubenswrapper[4797]: I1013 13:37:17.872638 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-vqq9m" event={"ID":"0c005f98-b2ac-4f0c-a46c-5cc7693640f2","Type":"ContainerStarted","Data":"bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066"} Oct 13 13:37:17 crc kubenswrapper[4797]: I1013 13:37:17.898025 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqq9m" podStartSLOduration=2.451274086 podStartE2EDuration="4.897977793s" podCreationTimestamp="2025-10-13 13:37:13 +0000 UTC" firstStartedPulling="2025-10-13 13:37:14.849089709 +0000 UTC m=+1812.382639965" lastFinishedPulling="2025-10-13 13:37:17.295793396 +0000 UTC m=+1814.829343672" observedRunningTime="2025-10-13 13:37:17.892877668 +0000 UTC m=+1815.426427924" watchObservedRunningTime="2025-10-13 13:37:17.897977793 +0000 UTC m=+1815.431528049" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.206189 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-clk2q"] Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.208021 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.223508 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clk2q"] Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.325119 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-catalog-content\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.325194 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8l2f\" (UniqueName: \"kubernetes.io/projected/57aad6e1-f233-4015-8a16-61f667c7237d-kube-api-access-w8l2f\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.325231 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-utilities\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.426609 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-utilities\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.426892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-catalog-content\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.426944 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8l2f\" (UniqueName: \"kubernetes.io/projected/57aad6e1-f233-4015-8a16-61f667c7237d-kube-api-access-w8l2f\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.427150 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-utilities\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.427372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-catalog-content\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.444101 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8l2f\" (UniqueName: \"kubernetes.io/projected/57aad6e1-f233-4015-8a16-61f667c7237d-kube-api-access-w8l2f\") pod \"redhat-operators-clk2q\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:18 crc kubenswrapper[4797]: I1013 13:37:18.532079 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:19 crc kubenswrapper[4797]: I1013 13:37:19.024382 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clk2q"] Oct 13 13:37:19 crc kubenswrapper[4797]: I1013 13:37:19.888972 4797 generic.go:334] "Generic (PLEG): container finished" podID="57aad6e1-f233-4015-8a16-61f667c7237d" containerID="63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4" exitCode=0 Oct 13 13:37:19 crc kubenswrapper[4797]: I1013 13:37:19.889046 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clk2q" event={"ID":"57aad6e1-f233-4015-8a16-61f667c7237d","Type":"ContainerDied","Data":"63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4"} Oct 13 13:37:19 crc kubenswrapper[4797]: I1013 13:37:19.889348 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clk2q" event={"ID":"57aad6e1-f233-4015-8a16-61f667c7237d","Type":"ContainerStarted","Data":"5e99bd2132b6dbb74316d5be6293e681f4e4831ab5acc31dd00b2b6d3b844cd7"} Oct 13 13:37:20 crc kubenswrapper[4797]: I1013 13:37:20.236227 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:37:20 crc kubenswrapper[4797]: E1013 13:37:20.236658 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:37:21 crc kubenswrapper[4797]: I1013 13:37:21.911524 4797 generic.go:334] "Generic (PLEG): container finished" podID="57aad6e1-f233-4015-8a16-61f667c7237d" 
containerID="4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838" exitCode=0 Oct 13 13:37:21 crc kubenswrapper[4797]: I1013 13:37:21.911662 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clk2q" event={"ID":"57aad6e1-f233-4015-8a16-61f667c7237d","Type":"ContainerDied","Data":"4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838"} Oct 13 13:37:22 crc kubenswrapper[4797]: I1013 13:37:22.921960 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clk2q" event={"ID":"57aad6e1-f233-4015-8a16-61f667c7237d","Type":"ContainerStarted","Data":"9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39"} Oct 13 13:37:22 crc kubenswrapper[4797]: I1013 13:37:22.943760 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-clk2q" podStartSLOduration=2.263395497 podStartE2EDuration="4.943741885s" podCreationTimestamp="2025-10-13 13:37:18 +0000 UTC" firstStartedPulling="2025-10-13 13:37:19.891036937 +0000 UTC m=+1817.424587203" lastFinishedPulling="2025-10-13 13:37:22.571383335 +0000 UTC m=+1820.104933591" observedRunningTime="2025-10-13 13:37:22.939502941 +0000 UTC m=+1820.473053227" watchObservedRunningTime="2025-10-13 13:37:22.943741885 +0000 UTC m=+1820.477292151" Oct 13 13:37:23 crc kubenswrapper[4797]: I1013 13:37:23.735079 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:23 crc kubenswrapper[4797]: I1013 13:37:23.735158 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:23 crc kubenswrapper[4797]: I1013 13:37:23.803545 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:23 crc kubenswrapper[4797]: I1013 13:37:23.998584 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.401618 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqq9m"] Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.402386 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqq9m" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="registry-server" containerID="cri-o://bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066" gracePeriod=2 Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.532694 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.532802 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.608202 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.837023 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.979712 4797 generic.go:334] "Generic (PLEG): container finished" podID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerID="bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066" exitCode=0 Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.979744 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqq9m" Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.979758 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq9m" event={"ID":"0c005f98-b2ac-4f0c-a46c-5cc7693640f2","Type":"ContainerDied","Data":"bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066"} Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.980370 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq9m" event={"ID":"0c005f98-b2ac-4f0c-a46c-5cc7693640f2","Type":"ContainerDied","Data":"d635df92d74ee64ccd2bc6ff48f1537cd86da8b605599252588d60e427fb5f0a"} Oct 13 13:37:28 crc kubenswrapper[4797]: I1013 13:37:28.980412 4797 scope.go:117] "RemoveContainer" containerID="bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:28.999489 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5zmx\" (UniqueName: \"kubernetes.io/projected/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-kube-api-access-f5zmx\") pod \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:28.999845 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-catalog-content\") pod \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:28.999891 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-utilities\") pod \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\" (UID: \"0c005f98-b2ac-4f0c-a46c-5cc7693640f2\") " Oct 13 13:37:29 crc 
kubenswrapper[4797]: I1013 13:37:29.001705 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-utilities" (OuterVolumeSpecName: "utilities") pod "0c005f98-b2ac-4f0c-a46c-5cc7693640f2" (UID: "0c005f98-b2ac-4f0c-a46c-5cc7693640f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.003590 4797 scope.go:117] "RemoveContainer" containerID="5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.009349 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-kube-api-access-f5zmx" (OuterVolumeSpecName: "kube-api-access-f5zmx") pod "0c005f98-b2ac-4f0c-a46c-5cc7693640f2" (UID: "0c005f98-b2ac-4f0c-a46c-5cc7693640f2"). InnerVolumeSpecName "kube-api-access-f5zmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.057571 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c005f98-b2ac-4f0c-a46c-5cc7693640f2" (UID: "0c005f98-b2ac-4f0c-a46c-5cc7693640f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.061783 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.063170 4797 scope.go:117] "RemoveContainer" containerID="c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.093877 4797 scope.go:117] "RemoveContainer" containerID="bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066" Oct 13 13:37:29 crc kubenswrapper[4797]: E1013 13:37:29.094329 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066\": container with ID starting with bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066 not found: ID does not exist" containerID="bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.094370 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066"} err="failed to get container status \"bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066\": rpc error: code = NotFound desc = could not find container \"bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066\": container with ID starting with bfaeb7ba1b5fe1a38de19b2b0458b2e65a036403b2291db17606351d4b3a3066 not found: ID does not exist" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.094432 4797 scope.go:117] "RemoveContainer" containerID="5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93" Oct 13 13:37:29 crc kubenswrapper[4797]: E1013 13:37:29.094787 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93\": container with ID starting with 5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93 not found: ID does not exist" containerID="5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.094845 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93"} err="failed to get container status \"5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93\": rpc error: code = NotFound desc = could not find container \"5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93\": container with ID starting with 5cf832277ecc677be91b575c854e2a7497340e6d08d6beaf9da3d93a1f089c93 not found: ID does not exist" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.094868 4797 scope.go:117] "RemoveContainer" containerID="c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2" Oct 13 13:37:29 crc kubenswrapper[4797]: E1013 13:37:29.095288 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2\": container with ID starting with c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2 not found: ID does not exist" containerID="c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.095490 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2"} err="failed to get container status \"c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2\": rpc error: code = NotFound desc = could not find container 
\"c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2\": container with ID starting with c0c7e8bb0cce1e03f353b3d71117bda1b3e6e1079d8a6acf045aa69bf1543bd2 not found: ID does not exist" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.101476 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5zmx\" (UniqueName: \"kubernetes.io/projected/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-kube-api-access-f5zmx\") on node \"crc\" DevicePath \"\"" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.101505 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.101519 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c005f98-b2ac-4f0c-a46c-5cc7693640f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.316176 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqq9m"] Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.322662 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqq9m"] Oct 13 13:37:29 crc kubenswrapper[4797]: I1013 13:37:29.803395 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clk2q"] Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.002401 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-clk2q" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="registry-server" containerID="cri-o://9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39" gracePeriod=2 Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.257250 4797 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" path="/var/lib/kubelet/pods/0c005f98-b2ac-4f0c-a46c-5cc7693640f2/volumes" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.528270 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.655938 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-utilities\") pod \"57aad6e1-f233-4015-8a16-61f667c7237d\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.656053 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8l2f\" (UniqueName: \"kubernetes.io/projected/57aad6e1-f233-4015-8a16-61f667c7237d-kube-api-access-w8l2f\") pod \"57aad6e1-f233-4015-8a16-61f667c7237d\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.656106 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-catalog-content\") pod \"57aad6e1-f233-4015-8a16-61f667c7237d\" (UID: \"57aad6e1-f233-4015-8a16-61f667c7237d\") " Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.657489 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-utilities" (OuterVolumeSpecName: "utilities") pod "57aad6e1-f233-4015-8a16-61f667c7237d" (UID: "57aad6e1-f233-4015-8a16-61f667c7237d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.663046 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57aad6e1-f233-4015-8a16-61f667c7237d-kube-api-access-w8l2f" (OuterVolumeSpecName: "kube-api-access-w8l2f") pod "57aad6e1-f233-4015-8a16-61f667c7237d" (UID: "57aad6e1-f233-4015-8a16-61f667c7237d"). InnerVolumeSpecName "kube-api-access-w8l2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.758081 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.758133 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8l2f\" (UniqueName: \"kubernetes.io/projected/57aad6e1-f233-4015-8a16-61f667c7237d-kube-api-access-w8l2f\") on node \"crc\" DevicePath \"\"" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.781671 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57aad6e1-f233-4015-8a16-61f667c7237d" (UID: "57aad6e1-f233-4015-8a16-61f667c7237d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:37:31 crc kubenswrapper[4797]: I1013 13:37:31.860095 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57aad6e1-f233-4015-8a16-61f667c7237d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.012098 4797 generic.go:334] "Generic (PLEG): container finished" podID="57aad6e1-f233-4015-8a16-61f667c7237d" containerID="9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39" exitCode=0 Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.012162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clk2q" event={"ID":"57aad6e1-f233-4015-8a16-61f667c7237d","Type":"ContainerDied","Data":"9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39"} Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.012185 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clk2q" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.012208 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clk2q" event={"ID":"57aad6e1-f233-4015-8a16-61f667c7237d","Type":"ContainerDied","Data":"5e99bd2132b6dbb74316d5be6293e681f4e4831ab5acc31dd00b2b6d3b844cd7"} Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.012228 4797 scope.go:117] "RemoveContainer" containerID="9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.032432 4797 scope.go:117] "RemoveContainer" containerID="4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.049259 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clk2q"] Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.054129 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-clk2q"] Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.054679 4797 scope.go:117] "RemoveContainer" containerID="63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.079429 4797 scope.go:117] "RemoveContainer" containerID="9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39" Oct 13 13:37:32 crc kubenswrapper[4797]: E1013 13:37:32.079951 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39\": container with ID starting with 9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39 not found: ID does not exist" containerID="9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.079997 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39"} err="failed to get container status \"9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39\": rpc error: code = NotFound desc = could not find container \"9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39\": container with ID starting with 9cbde833a8fd57ac57dd755509e16abf6a21615a99d0ad59df56f7917b437a39 not found: ID does not exist" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.080024 4797 scope.go:117] "RemoveContainer" containerID="4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838" Oct 13 13:37:32 crc kubenswrapper[4797]: E1013 13:37:32.080442 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838\": container with ID starting with 4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838 not found: ID does not exist" containerID="4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.080473 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838"} err="failed to get container status \"4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838\": rpc error: code = NotFound desc = could not find container \"4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838\": container with ID starting with 4bbb71263877d47cfc1a324991ec3b3fc77cffe1bb971527136b716ba1c63838 not found: ID does not exist" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.080493 4797 scope.go:117] "RemoveContainer" containerID="63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4" Oct 13 13:37:32 crc kubenswrapper[4797]: E1013 
13:37:32.080714 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4\": container with ID starting with 63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4 not found: ID does not exist" containerID="63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4" Oct 13 13:37:32 crc kubenswrapper[4797]: I1013 13:37:32.080739 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4"} err="failed to get container status \"63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4\": rpc error: code = NotFound desc = could not find container \"63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4\": container with ID starting with 63a816934140141b92e10d68517eff1eebc43d476b1d3c2f81ea32bdc6db28f4 not found: ID does not exist" Oct 13 13:37:33 crc kubenswrapper[4797]: I1013 13:37:33.254633 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" path="/var/lib/kubelet/pods/57aad6e1-f233-4015-8a16-61f667c7237d/volumes" Oct 13 13:37:34 crc kubenswrapper[4797]: I1013 13:37:34.236121 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:37:34 crc kubenswrapper[4797]: E1013 13:37:34.236516 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:37:46 crc kubenswrapper[4797]: I1013 13:37:46.236058 
4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:37:46 crc kubenswrapper[4797]: E1013 13:37:46.237531 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:37:58 crc kubenswrapper[4797]: I1013 13:37:58.237276 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:37:58 crc kubenswrapper[4797]: E1013 13:37:58.240423 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:38:09 crc kubenswrapper[4797]: I1013 13:38:09.236954 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:38:09 crc kubenswrapper[4797]: E1013 13:38:09.237674 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:38:21 crc kubenswrapper[4797]: I1013 
13:38:21.237445 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:38:22 crc kubenswrapper[4797]: I1013 13:38:22.498340 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"187fe1ba906854ff992646e2036cc647b869f99f167e66271cdb87f4d6a2e410"} Oct 13 13:40:48 crc kubenswrapper[4797]: I1013 13:40:48.120724 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:40:48 crc kubenswrapper[4797]: I1013 13:40:48.121298 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.562338 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xp7zq"] Oct 13 13:41:15 crc kubenswrapper[4797]: E1013 13:41:15.563268 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="extract-content" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563285 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="extract-content" Oct 13 13:41:15 crc kubenswrapper[4797]: E1013 13:41:15.563299 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="extract-utilities" Oct 13 
13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563307 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="extract-utilities" Oct 13 13:41:15 crc kubenswrapper[4797]: E1013 13:41:15.563329 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="extract-utilities" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563338 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="extract-utilities" Oct 13 13:41:15 crc kubenswrapper[4797]: E1013 13:41:15.563354 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="extract-content" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563364 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="extract-content" Oct 13 13:41:15 crc kubenswrapper[4797]: E1013 13:41:15.563378 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="registry-server" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563385 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="registry-server" Oct 13 13:41:15 crc kubenswrapper[4797]: E1013 13:41:15.563407 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="registry-server" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563415 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="registry-server" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.563581 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c005f98-b2ac-4f0c-a46c-5cc7693640f2" containerName="registry-server" Oct 13 13:41:15 
crc kubenswrapper[4797]: I1013 13:41:15.563599 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aad6e1-f233-4015-8a16-61f667c7237d" containerName="registry-server" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.564971 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.572374 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp7zq"] Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.707974 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnc44\" (UniqueName: \"kubernetes.io/projected/70ccb1a0-247a-48fe-b81a-8f306aeef110-kube-api-access-gnc44\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.708106 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-utilities\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.708230 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-catalog-content\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.810064 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnc44\" (UniqueName: 
\"kubernetes.io/projected/70ccb1a0-247a-48fe-b81a-8f306aeef110-kube-api-access-gnc44\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.810116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-utilities\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.810140 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-catalog-content\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.810625 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-catalog-content\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.811302 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-utilities\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.833126 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnc44\" (UniqueName: 
\"kubernetes.io/projected/70ccb1a0-247a-48fe-b81a-8f306aeef110-kube-api-access-gnc44\") pod \"community-operators-xp7zq\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:15 crc kubenswrapper[4797]: I1013 13:41:15.899020 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.133907 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxrn5"] Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.135599 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.143263 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxrn5"] Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.215603 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-utilities\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.215723 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-catalog-content\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.215759 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxkpt\" (UniqueName: 
\"kubernetes.io/projected/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-kube-api-access-lxkpt\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.317329 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-catalog-content\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.317370 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxkpt\" (UniqueName: \"kubernetes.io/projected/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-kube-api-access-lxkpt\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.317438 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-utilities\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.317929 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-catalog-content\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.317971 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-utilities\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.332919 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxkpt\" (UniqueName: \"kubernetes.io/projected/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-kube-api-access-lxkpt\") pod \"redhat-marketplace-sxrn5\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.434570 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp7zq"] Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.457216 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.908067 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxrn5"] Oct 13 13:41:16 crc kubenswrapper[4797]: W1013 13:41:16.908963 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dcf8382_bb4f_4ba4_a85d_4610dad46d99.slice/crio-b220d495eb52aa2a944b4434b0c4a5b5d19bd0efb137cded41fdd13a032f55d7 WatchSource:0}: Error finding container b220d495eb52aa2a944b4434b0c4a5b5d19bd0efb137cded41fdd13a032f55d7: Status 404 returned error can't find the container with id b220d495eb52aa2a944b4434b0c4a5b5d19bd0efb137cded41fdd13a032f55d7 Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.996513 4797 generic.go:334] "Generic (PLEG): container finished" podID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerID="b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7" exitCode=0 Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 
13:41:16.996637 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7zq" event={"ID":"70ccb1a0-247a-48fe-b81a-8f306aeef110","Type":"ContainerDied","Data":"b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7"} Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.996683 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7zq" event={"ID":"70ccb1a0-247a-48fe-b81a-8f306aeef110","Type":"ContainerStarted","Data":"9f9e72499847c667df79bd0dac504234d21f79e30dafe2688a45cd257c4e1b7f"} Oct 13 13:41:16 crc kubenswrapper[4797]: I1013 13:41:16.997466 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxrn5" event={"ID":"4dcf8382-bb4f-4ba4-a85d-4610dad46d99","Type":"ContainerStarted","Data":"b220d495eb52aa2a944b4434b0c4a5b5d19bd0efb137cded41fdd13a032f55d7"} Oct 13 13:41:18 crc kubenswrapper[4797]: I1013 13:41:18.009753 4797 generic.go:334] "Generic (PLEG): container finished" podID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerID="266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099" exitCode=0 Oct 13 13:41:18 crc kubenswrapper[4797]: I1013 13:41:18.009880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxrn5" event={"ID":"4dcf8382-bb4f-4ba4-a85d-4610dad46d99","Type":"ContainerDied","Data":"266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099"} Oct 13 13:41:18 crc kubenswrapper[4797]: I1013 13:41:18.120626 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:41:18 crc kubenswrapper[4797]: I1013 13:41:18.120704 4797 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:41:19 crc kubenswrapper[4797]: I1013 13:41:19.042757 4797 generic.go:334] "Generic (PLEG): container finished" podID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerID="22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71" exitCode=0 Oct 13 13:41:19 crc kubenswrapper[4797]: I1013 13:41:19.043023 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7zq" event={"ID":"70ccb1a0-247a-48fe-b81a-8f306aeef110","Type":"ContainerDied","Data":"22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71"} Oct 13 13:41:20 crc kubenswrapper[4797]: I1013 13:41:20.055508 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7zq" event={"ID":"70ccb1a0-247a-48fe-b81a-8f306aeef110","Type":"ContainerStarted","Data":"bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5"} Oct 13 13:41:20 crc kubenswrapper[4797]: I1013 13:41:20.058170 4797 generic.go:334] "Generic (PLEG): container finished" podID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerID="c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8" exitCode=0 Oct 13 13:41:20 crc kubenswrapper[4797]: I1013 13:41:20.058226 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxrn5" event={"ID":"4dcf8382-bb4f-4ba4-a85d-4610dad46d99","Type":"ContainerDied","Data":"c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8"} Oct 13 13:41:20 crc kubenswrapper[4797]: I1013 13:41:20.087666 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xp7zq" podStartSLOduration=2.5935993760000002 
podStartE2EDuration="5.087645975s" podCreationTimestamp="2025-10-13 13:41:15 +0000 UTC" firstStartedPulling="2025-10-13 13:41:16.998621366 +0000 UTC m=+2054.532171622" lastFinishedPulling="2025-10-13 13:41:19.492667925 +0000 UTC m=+2057.026218221" observedRunningTime="2025-10-13 13:41:20.078396788 +0000 UTC m=+2057.611947054" watchObservedRunningTime="2025-10-13 13:41:20.087645975 +0000 UTC m=+2057.621196241" Oct 13 13:41:21 crc kubenswrapper[4797]: I1013 13:41:21.067465 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxrn5" event={"ID":"4dcf8382-bb4f-4ba4-a85d-4610dad46d99","Type":"ContainerStarted","Data":"94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2"} Oct 13 13:41:21 crc kubenswrapper[4797]: I1013 13:41:21.083197 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxrn5" podStartSLOduration=2.5027044 podStartE2EDuration="5.083178788s" podCreationTimestamp="2025-10-13 13:41:16 +0000 UTC" firstStartedPulling="2025-10-13 13:41:18.013210936 +0000 UTC m=+2055.546761222" lastFinishedPulling="2025-10-13 13:41:20.593685354 +0000 UTC m=+2058.127235610" observedRunningTime="2025-10-13 13:41:21.081196169 +0000 UTC m=+2058.614746435" watchObservedRunningTime="2025-10-13 13:41:21.083178788 +0000 UTC m=+2058.616729044" Oct 13 13:41:25 crc kubenswrapper[4797]: I1013 13:41:25.899103 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:25 crc kubenswrapper[4797]: I1013 13:41:25.899358 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:25 crc kubenswrapper[4797]: I1013 13:41:25.972915 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:26 crc kubenswrapper[4797]: I1013 
13:41:26.178674 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:26 crc kubenswrapper[4797]: I1013 13:41:26.458288 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:26 crc kubenswrapper[4797]: I1013 13:41:26.458355 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:26 crc kubenswrapper[4797]: I1013 13:41:26.514027 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:27 crc kubenswrapper[4797]: I1013 13:41:27.200931 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:27 crc kubenswrapper[4797]: I1013 13:41:27.531976 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp7zq"] Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.142078 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xp7zq" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="registry-server" containerID="cri-o://bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5" gracePeriod=2 Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.556147 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.703290 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnc44\" (UniqueName: \"kubernetes.io/projected/70ccb1a0-247a-48fe-b81a-8f306aeef110-kube-api-access-gnc44\") pod \"70ccb1a0-247a-48fe-b81a-8f306aeef110\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.704387 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-utilities\") pod \"70ccb1a0-247a-48fe-b81a-8f306aeef110\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.705182 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-utilities" (OuterVolumeSpecName: "utilities") pod "70ccb1a0-247a-48fe-b81a-8f306aeef110" (UID: "70ccb1a0-247a-48fe-b81a-8f306aeef110"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.705327 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-catalog-content\") pod \"70ccb1a0-247a-48fe-b81a-8f306aeef110\" (UID: \"70ccb1a0-247a-48fe-b81a-8f306aeef110\") " Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.705821 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.712081 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ccb1a0-247a-48fe-b81a-8f306aeef110-kube-api-access-gnc44" (OuterVolumeSpecName: "kube-api-access-gnc44") pod "70ccb1a0-247a-48fe-b81a-8f306aeef110" (UID: "70ccb1a0-247a-48fe-b81a-8f306aeef110"). InnerVolumeSpecName "kube-api-access-gnc44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.781233 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70ccb1a0-247a-48fe-b81a-8f306aeef110" (UID: "70ccb1a0-247a-48fe-b81a-8f306aeef110"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.807093 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ccb1a0-247a-48fe-b81a-8f306aeef110-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.807142 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnc44\" (UniqueName: \"kubernetes.io/projected/70ccb1a0-247a-48fe-b81a-8f306aeef110-kube-api-access-gnc44\") on node \"crc\" DevicePath \"\"" Oct 13 13:41:28 crc kubenswrapper[4797]: I1013 13:41:28.927159 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxrn5"] Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.155320 4797 generic.go:334] "Generic (PLEG): container finished" podID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerID="bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5" exitCode=0 Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.155383 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp7zq" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.155391 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7zq" event={"ID":"70ccb1a0-247a-48fe-b81a-8f306aeef110","Type":"ContainerDied","Data":"bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5"} Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.155461 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7zq" event={"ID":"70ccb1a0-247a-48fe-b81a-8f306aeef110","Type":"ContainerDied","Data":"9f9e72499847c667df79bd0dac504234d21f79e30dafe2688a45cd257c4e1b7f"} Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.155507 4797 scope.go:117] "RemoveContainer" containerID="bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.156501 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxrn5" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="registry-server" containerID="cri-o://94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2" gracePeriod=2 Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.184634 4797 scope.go:117] "RemoveContainer" containerID="22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.216992 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp7zq"] Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.217635 4797 scope.go:117] "RemoveContainer" containerID="b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.227350 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xp7zq"] Oct 13 13:41:29 crc 
kubenswrapper[4797]: I1013 13:41:29.247410 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" path="/var/lib/kubelet/pods/70ccb1a0-247a-48fe-b81a-8f306aeef110/volumes" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.344190 4797 scope.go:117] "RemoveContainer" containerID="bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5" Oct 13 13:41:29 crc kubenswrapper[4797]: E1013 13:41:29.344714 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5\": container with ID starting with bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5 not found: ID does not exist" containerID="bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.344744 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5"} err="failed to get container status \"bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5\": rpc error: code = NotFound desc = could not find container \"bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5\": container with ID starting with bb1633a7272dad1968a6d33810486f2e402481848bacc210e36a35e08b9b10e5 not found: ID does not exist" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.344766 4797 scope.go:117] "RemoveContainer" containerID="22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71" Oct 13 13:41:29 crc kubenswrapper[4797]: E1013 13:41:29.345313 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71\": container with ID starting with 
22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71 not found: ID does not exist" containerID="22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.345361 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71"} err="failed to get container status \"22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71\": rpc error: code = NotFound desc = could not find container \"22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71\": container with ID starting with 22afdbbe99eb08e2a272558dd534334dd34bdf8cc3c19fa1409271fc88530f71 not found: ID does not exist" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.345398 4797 scope.go:117] "RemoveContainer" containerID="b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7" Oct 13 13:41:29 crc kubenswrapper[4797]: E1013 13:41:29.345787 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7\": container with ID starting with b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7 not found: ID does not exist" containerID="b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.345850 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7"} err="failed to get container status \"b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7\": rpc error: code = NotFound desc = could not find container \"b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7\": container with ID starting with b4081b90e1e7e0f5940580c3f995a67349057fe08ca554ed5a0c0a518064bfa7 not found: ID does not 
exist" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.524707 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.617439 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxkpt\" (UniqueName: \"kubernetes.io/projected/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-kube-api-access-lxkpt\") pod \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.617838 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-utilities\") pod \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.617925 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-catalog-content\") pod \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\" (UID: \"4dcf8382-bb4f-4ba4-a85d-4610dad46d99\") " Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.618844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-utilities" (OuterVolumeSpecName: "utilities") pod "4dcf8382-bb4f-4ba4-a85d-4610dad46d99" (UID: "4dcf8382-bb4f-4ba4-a85d-4610dad46d99"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.622993 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-kube-api-access-lxkpt" (OuterVolumeSpecName: "kube-api-access-lxkpt") pod "4dcf8382-bb4f-4ba4-a85d-4610dad46d99" (UID: "4dcf8382-bb4f-4ba4-a85d-4610dad46d99"). InnerVolumeSpecName "kube-api-access-lxkpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.639251 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dcf8382-bb4f-4ba4-a85d-4610dad46d99" (UID: "4dcf8382-bb4f-4ba4-a85d-4610dad46d99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.720206 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.720274 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxkpt\" (UniqueName: \"kubernetes.io/projected/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-kube-api-access-lxkpt\") on node \"crc\" DevicePath \"\"" Oct 13 13:41:29 crc kubenswrapper[4797]: I1013 13:41:29.720300 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dcf8382-bb4f-4ba4-a85d-4610dad46d99-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.165463 4797 generic.go:334] "Generic (PLEG): container finished" podID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" 
containerID="94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2" exitCode=0 Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.165519 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxrn5" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.165521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxrn5" event={"ID":"4dcf8382-bb4f-4ba4-a85d-4610dad46d99","Type":"ContainerDied","Data":"94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2"} Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.165572 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxrn5" event={"ID":"4dcf8382-bb4f-4ba4-a85d-4610dad46d99","Type":"ContainerDied","Data":"b220d495eb52aa2a944b4434b0c4a5b5d19bd0efb137cded41fdd13a032f55d7"} Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.165599 4797 scope.go:117] "RemoveContainer" containerID="94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.189145 4797 scope.go:117] "RemoveContainer" containerID="c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.200865 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxrn5"] Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.204760 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxrn5"] Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.210778 4797 scope.go:117] "RemoveContainer" containerID="266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.228386 4797 scope.go:117] "RemoveContainer" containerID="94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2" Oct 13 
13:41:30 crc kubenswrapper[4797]: E1013 13:41:30.228880 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2\": container with ID starting with 94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2 not found: ID does not exist" containerID="94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.228910 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2"} err="failed to get container status \"94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2\": rpc error: code = NotFound desc = could not find container \"94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2\": container with ID starting with 94082aafb027d38bf5ff88103d47b4ef73b7178bd8728310107a9d5364ede2a2 not found: ID does not exist" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.228936 4797 scope.go:117] "RemoveContainer" containerID="c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8" Oct 13 13:41:30 crc kubenswrapper[4797]: E1013 13:41:30.229207 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8\": container with ID starting with c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8 not found: ID does not exist" containerID="c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.229230 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8"} err="failed to get container status 
\"c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8\": rpc error: code = NotFound desc = could not find container \"c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8\": container with ID starting with c98be99ee906a1e2802a650cfb8acec43385ebe45b923971d9b0420d513c77d8 not found: ID does not exist" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.229247 4797 scope.go:117] "RemoveContainer" containerID="266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099" Oct 13 13:41:30 crc kubenswrapper[4797]: E1013 13:41:30.229590 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099\": container with ID starting with 266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099 not found: ID does not exist" containerID="266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099" Oct 13 13:41:30 crc kubenswrapper[4797]: I1013 13:41:30.229611 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099"} err="failed to get container status \"266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099\": rpc error: code = NotFound desc = could not find container \"266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099\": container with ID starting with 266457f83429cef644b50125c8900d9b7b7ced38200f9723a0db8127e72dc099 not found: ID does not exist" Oct 13 13:41:31 crc kubenswrapper[4797]: I1013 13:41:31.253784 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" path="/var/lib/kubelet/pods/4dcf8382-bb4f-4ba4-a85d-4610dad46d99/volumes" Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.119879 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.120519 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.120581 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.121484 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"187fe1ba906854ff992646e2036cc647b869f99f167e66271cdb87f4d6a2e410"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.121564 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://187fe1ba906854ff992646e2036cc647b869f99f167e66271cdb87f4d6a2e410" gracePeriod=600 Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.322091 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="187fe1ba906854ff992646e2036cc647b869f99f167e66271cdb87f4d6a2e410" exitCode=0 Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.322149 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"187fe1ba906854ff992646e2036cc647b869f99f167e66271cdb87f4d6a2e410"} Oct 13 13:41:48 crc kubenswrapper[4797]: I1013 13:41:48.322490 4797 scope.go:117] "RemoveContainer" containerID="2e1a8d37a243fdb391fd594b14c3967cbe771bff4ffdfcbdee7201408ecf2edb" Oct 13 13:41:49 crc kubenswrapper[4797]: I1013 13:41:49.332941 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698"} Oct 13 13:43:48 crc kubenswrapper[4797]: I1013 13:43:48.120375 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:43:48 crc kubenswrapper[4797]: I1013 13:43:48.121091 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:44:18 crc kubenswrapper[4797]: I1013 13:44:18.120144 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:44:18 crc kubenswrapper[4797]: I1013 13:44:18.121508 4797 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.121004 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.121466 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.121507 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.122116 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.122170 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" 
containerID="cri-o://7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" gracePeriod=600 Oct 13 13:44:48 crc kubenswrapper[4797]: E1013 13:44:48.249734 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.761081 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" exitCode=0 Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.761147 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698"} Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.761360 4797 scope.go:117] "RemoveContainer" containerID="187fe1ba906854ff992646e2036cc647b869f99f167e66271cdb87f4d6a2e410" Oct 13 13:44:48 crc kubenswrapper[4797]: I1013 13:44:48.761768 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:44:48 crc kubenswrapper[4797]: E1013 13:44:48.761999 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.161702 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk"] Oct 13 13:45:00 crc kubenswrapper[4797]: E1013 13:45:00.163557 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="extract-content" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163615 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="extract-content" Oct 13 13:45:00 crc kubenswrapper[4797]: E1013 13:45:00.163650 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="registry-server" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163658 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="registry-server" Oct 13 13:45:00 crc kubenswrapper[4797]: E1013 13:45:00.163680 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="registry-server" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163687 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="registry-server" Oct 13 13:45:00 crc kubenswrapper[4797]: E1013 13:45:00.163698 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="extract-utilities" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163705 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="extract-utilities" Oct 13 13:45:00 crc kubenswrapper[4797]: E1013 13:45:00.163717 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="extract-content" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163723 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="extract-content" Oct 13 13:45:00 crc kubenswrapper[4797]: E1013 13:45:00.163730 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="extract-utilities" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163735 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="extract-utilities" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163909 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dcf8382-bb4f-4ba4-a85d-4610dad46d99" containerName="registry-server" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.163944 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ccb1a0-247a-48fe-b81a-8f306aeef110" containerName="registry-server" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.164449 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.169763 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.170307 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.175333 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk"] Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.204193 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d78659-91ac-4b36-aade-be7b446d8276-config-volume\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.204259 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d78659-91ac-4b36-aade-be7b446d8276-secret-volume\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.204312 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvtcj\" (UniqueName: \"kubernetes.io/projected/00d78659-91ac-4b36-aade-be7b446d8276-kube-api-access-jvtcj\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.305359 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d78659-91ac-4b36-aade-be7b446d8276-config-volume\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.305426 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d78659-91ac-4b36-aade-be7b446d8276-secret-volume\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.305465 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvtcj\" (UniqueName: \"kubernetes.io/projected/00d78659-91ac-4b36-aade-be7b446d8276-kube-api-access-jvtcj\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.308077 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d78659-91ac-4b36-aade-be7b446d8276-config-volume\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.312440 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/00d78659-91ac-4b36-aade-be7b446d8276-secret-volume\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.327607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvtcj\" (UniqueName: \"kubernetes.io/projected/00d78659-91ac-4b36-aade-be7b446d8276-kube-api-access-jvtcj\") pod \"collect-profiles-29339385-zjjkk\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.494191 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:00 crc kubenswrapper[4797]: I1013 13:45:00.905471 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk"] Oct 13 13:45:01 crc kubenswrapper[4797]: I1013 13:45:01.236252 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:45:01 crc kubenswrapper[4797]: E1013 13:45:01.237545 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:45:01 crc kubenswrapper[4797]: I1013 13:45:01.876103 4797 generic.go:334] "Generic (PLEG): container finished" podID="00d78659-91ac-4b36-aade-be7b446d8276" containerID="d61444aaba65a86d5ad081c58fbb5c5f16a174cc19db93ea309419d98c9f6f88" 
exitCode=0 Oct 13 13:45:01 crc kubenswrapper[4797]: I1013 13:45:01.876316 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" event={"ID":"00d78659-91ac-4b36-aade-be7b446d8276","Type":"ContainerDied","Data":"d61444aaba65a86d5ad081c58fbb5c5f16a174cc19db93ea309419d98c9f6f88"} Oct 13 13:45:01 crc kubenswrapper[4797]: I1013 13:45:01.876578 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" event={"ID":"00d78659-91ac-4b36-aade-be7b446d8276","Type":"ContainerStarted","Data":"892a27e5068754c48910a7f44c99cde2442032c0f737aa98a6c9d56f13d562f3"} Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.183792 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.273941 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvtcj\" (UniqueName: \"kubernetes.io/projected/00d78659-91ac-4b36-aade-be7b446d8276-kube-api-access-jvtcj\") pod \"00d78659-91ac-4b36-aade-be7b446d8276\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.274016 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d78659-91ac-4b36-aade-be7b446d8276-config-volume\") pod \"00d78659-91ac-4b36-aade-be7b446d8276\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.274048 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d78659-91ac-4b36-aade-be7b446d8276-secret-volume\") pod \"00d78659-91ac-4b36-aade-be7b446d8276\" (UID: \"00d78659-91ac-4b36-aade-be7b446d8276\") " Oct 13 
13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.274960 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d78659-91ac-4b36-aade-be7b446d8276-config-volume" (OuterVolumeSpecName: "config-volume") pod "00d78659-91ac-4b36-aade-be7b446d8276" (UID: "00d78659-91ac-4b36-aade-be7b446d8276"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.287686 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d78659-91ac-4b36-aade-be7b446d8276-kube-api-access-jvtcj" (OuterVolumeSpecName: "kube-api-access-jvtcj") pod "00d78659-91ac-4b36-aade-be7b446d8276" (UID: "00d78659-91ac-4b36-aade-be7b446d8276"). InnerVolumeSpecName "kube-api-access-jvtcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.288250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d78659-91ac-4b36-aade-be7b446d8276-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "00d78659-91ac-4b36-aade-be7b446d8276" (UID: "00d78659-91ac-4b36-aade-be7b446d8276"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.375788 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvtcj\" (UniqueName: \"kubernetes.io/projected/00d78659-91ac-4b36-aade-be7b446d8276-kube-api-access-jvtcj\") on node \"crc\" DevicePath \"\"" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.375857 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d78659-91ac-4b36-aade-be7b446d8276-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.375870 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d78659-91ac-4b36-aade-be7b446d8276-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.894670 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" event={"ID":"00d78659-91ac-4b36-aade-be7b446d8276","Type":"ContainerDied","Data":"892a27e5068754c48910a7f44c99cde2442032c0f737aa98a6c9d56f13d562f3"} Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.894999 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892a27e5068754c48910a7f44c99cde2442032c0f737aa98a6c9d56f13d562f3" Oct 13 13:45:03 crc kubenswrapper[4797]: I1013 13:45:03.894717 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk" Oct 13 13:45:04 crc kubenswrapper[4797]: I1013 13:45:04.272198 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"] Oct 13 13:45:04 crc kubenswrapper[4797]: I1013 13:45:04.284474 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339340-k2d9s"] Oct 13 13:45:05 crc kubenswrapper[4797]: I1013 13:45:05.245990 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2e514b-61f8-47b1-975e-4a910550ecaa" path="/var/lib/kubelet/pods/fc2e514b-61f8-47b1-975e-4a910550ecaa/volumes" Oct 13 13:45:10 crc kubenswrapper[4797]: I1013 13:45:10.836333 4797 scope.go:117] "RemoveContainer" containerID="4784279ce83fb7e1cdeef66da1ff994c99dbc1fb7d75ec0b692e2916b972f53d" Oct 13 13:45:16 crc kubenswrapper[4797]: I1013 13:45:16.236382 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:45:16 crc kubenswrapper[4797]: E1013 13:45:16.237008 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:45:28 crc kubenswrapper[4797]: I1013 13:45:28.236490 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:45:28 crc kubenswrapper[4797]: E1013 13:45:28.237459 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:45:43 crc kubenswrapper[4797]: I1013 13:45:43.241390 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:45:43 crc kubenswrapper[4797]: E1013 13:45:43.242233 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:45:54 crc kubenswrapper[4797]: I1013 13:45:54.236687 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:45:54 crc kubenswrapper[4797]: E1013 13:45:54.237359 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:46:05 crc kubenswrapper[4797]: I1013 13:46:05.236558 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:46:05 crc kubenswrapper[4797]: E1013 13:46:05.237262 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:46:18 crc kubenswrapper[4797]: I1013 13:46:18.235906 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:46:18 crc kubenswrapper[4797]: E1013 13:46:18.236944 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:46:32 crc kubenswrapper[4797]: I1013 13:46:32.235721 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:46:32 crc kubenswrapper[4797]: E1013 13:46:32.236482 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:46:47 crc kubenswrapper[4797]: I1013 13:46:47.236745 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:46:47 crc kubenswrapper[4797]: E1013 13:46:47.237609 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:46:58 crc kubenswrapper[4797]: I1013 13:46:58.236889 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:46:58 crc kubenswrapper[4797]: E1013 13:46:58.237720 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:47:11 crc kubenswrapper[4797]: I1013 13:47:11.236567 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:47:11 crc kubenswrapper[4797]: E1013 13:47:11.237385 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.182845 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-84b4b"] Oct 13 13:47:22 crc kubenswrapper[4797]: E1013 13:47:22.184258 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d78659-91ac-4b36-aade-be7b446d8276" 
containerName="collect-profiles" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.184293 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d78659-91ac-4b36-aade-be7b446d8276" containerName="collect-profiles" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.184683 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d78659-91ac-4b36-aade-be7b446d8276" containerName="collect-profiles" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.186625 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.189191 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84b4b"] Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.334575 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-utilities\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.334852 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6944\" (UniqueName: \"kubernetes.io/projected/5be7b2eb-7c23-45d7-9332-8e578d5ce449-kube-api-access-p6944\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.334982 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-catalog-content\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " 
pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.436845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-utilities\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.436940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6944\" (UniqueName: \"kubernetes.io/projected/5be7b2eb-7c23-45d7-9332-8e578d5ce449-kube-api-access-p6944\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.437003 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-catalog-content\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.437424 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-utilities\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.437476 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-catalog-content\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc 
kubenswrapper[4797]: I1013 13:47:22.461490 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6944\" (UniqueName: \"kubernetes.io/projected/5be7b2eb-7c23-45d7-9332-8e578d5ce449-kube-api-access-p6944\") pod \"redhat-operators-84b4b\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.547673 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:22 crc kubenswrapper[4797]: I1013 13:47:22.984588 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84b4b"] Oct 13 13:47:23 crc kubenswrapper[4797]: I1013 13:47:23.980741 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerID="a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe" exitCode=0 Oct 13 13:47:23 crc kubenswrapper[4797]: I1013 13:47:23.981073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84b4b" event={"ID":"5be7b2eb-7c23-45d7-9332-8e578d5ce449","Type":"ContainerDied","Data":"a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe"} Oct 13 13:47:23 crc kubenswrapper[4797]: I1013 13:47:23.981101 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84b4b" event={"ID":"5be7b2eb-7c23-45d7-9332-8e578d5ce449","Type":"ContainerStarted","Data":"e45c6d81eff670503ac882e136887c969775a958ab0ea6b41fe2f0399ddf102f"} Oct 13 13:47:23 crc kubenswrapper[4797]: I1013 13:47:23.982359 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 13:47:25 crc kubenswrapper[4797]: I1013 13:47:25.236371 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:47:25 crc 
kubenswrapper[4797]: E1013 13:47:25.237333 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:47:25 crc kubenswrapper[4797]: I1013 13:47:25.997243 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerID="ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e" exitCode=0 Oct 13 13:47:25 crc kubenswrapper[4797]: I1013 13:47:25.997296 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84b4b" event={"ID":"5be7b2eb-7c23-45d7-9332-8e578d5ce449","Type":"ContainerDied","Data":"ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e"} Oct 13 13:47:27 crc kubenswrapper[4797]: I1013 13:47:27.006158 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84b4b" event={"ID":"5be7b2eb-7c23-45d7-9332-8e578d5ce449","Type":"ContainerStarted","Data":"96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20"} Oct 13 13:47:27 crc kubenswrapper[4797]: I1013 13:47:27.032647 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-84b4b" podStartSLOduration=2.4765680469999998 podStartE2EDuration="5.032631741s" podCreationTimestamp="2025-10-13 13:47:22 +0000 UTC" firstStartedPulling="2025-10-13 13:47:23.98214065 +0000 UTC m=+2421.515690906" lastFinishedPulling="2025-10-13 13:47:26.538204344 +0000 UTC m=+2424.071754600" observedRunningTime="2025-10-13 13:47:27.029123685 +0000 UTC m=+2424.562673951" watchObservedRunningTime="2025-10-13 13:47:27.032631741 +0000 UTC 
m=+2424.566181997" Oct 13 13:47:32 crc kubenswrapper[4797]: I1013 13:47:32.548864 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:32 crc kubenswrapper[4797]: I1013 13:47:32.549456 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:32 crc kubenswrapper[4797]: I1013 13:47:32.609486 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:33 crc kubenswrapper[4797]: I1013 13:47:33.118932 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:33 crc kubenswrapper[4797]: I1013 13:47:33.170122 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84b4b"] Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.075871 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-84b4b" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="registry-server" containerID="cri-o://96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20" gracePeriod=2 Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.499132 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.627090 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-catalog-content\") pod \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.627344 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-utilities\") pod \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.627415 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6944\" (UniqueName: \"kubernetes.io/projected/5be7b2eb-7c23-45d7-9332-8e578d5ce449-kube-api-access-p6944\") pod \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\" (UID: \"5be7b2eb-7c23-45d7-9332-8e578d5ce449\") " Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.628311 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-utilities" (OuterVolumeSpecName: "utilities") pod "5be7b2eb-7c23-45d7-9332-8e578d5ce449" (UID: "5be7b2eb-7c23-45d7-9332-8e578d5ce449"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.635985 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be7b2eb-7c23-45d7-9332-8e578d5ce449-kube-api-access-p6944" (OuterVolumeSpecName: "kube-api-access-p6944") pod "5be7b2eb-7c23-45d7-9332-8e578d5ce449" (UID: "5be7b2eb-7c23-45d7-9332-8e578d5ce449"). InnerVolumeSpecName "kube-api-access-p6944". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.729052 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:47:35 crc kubenswrapper[4797]: I1013 13:47:35.729109 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6944\" (UniqueName: \"kubernetes.io/projected/5be7b2eb-7c23-45d7-9332-8e578d5ce449-kube-api-access-p6944\") on node \"crc\" DevicePath \"\"" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.087406 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerID="96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20" exitCode=0 Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.087460 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84b4b" event={"ID":"5be7b2eb-7c23-45d7-9332-8e578d5ce449","Type":"ContainerDied","Data":"96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20"} Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.087493 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84b4b" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.087526 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84b4b" event={"ID":"5be7b2eb-7c23-45d7-9332-8e578d5ce449","Type":"ContainerDied","Data":"e45c6d81eff670503ac882e136887c969775a958ab0ea6b41fe2f0399ddf102f"} Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.087561 4797 scope.go:117] "RemoveContainer" containerID="96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.108290 4797 scope.go:117] "RemoveContainer" containerID="ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.135469 4797 scope.go:117] "RemoveContainer" containerID="a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.161123 4797 scope.go:117] "RemoveContainer" containerID="96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20" Oct 13 13:47:36 crc kubenswrapper[4797]: E1013 13:47:36.161561 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20\": container with ID starting with 96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20 not found: ID does not exist" containerID="96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.161622 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20"} err="failed to get container status \"96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20\": rpc error: code = NotFound desc = could not find container 
\"96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20\": container with ID starting with 96ce9dd6fe1e3e9d44fd10819a5946a264dbd47a7c26b7092c90037325a09c20 not found: ID does not exist" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.161661 4797 scope.go:117] "RemoveContainer" containerID="ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e" Oct 13 13:47:36 crc kubenswrapper[4797]: E1013 13:47:36.162157 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e\": container with ID starting with ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e not found: ID does not exist" containerID="ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.162228 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e"} err="failed to get container status \"ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e\": rpc error: code = NotFound desc = could not find container \"ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e\": container with ID starting with ed03b8c6083c762c30ae2ed0fa6a1964854f0ae3c580a6f8b5a8b6aa9b1ed06e not found: ID does not exist" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.162268 4797 scope.go:117] "RemoveContainer" containerID="a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe" Oct 13 13:47:36 crc kubenswrapper[4797]: E1013 13:47:36.162775 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe\": container with ID starting with a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe not found: ID does not exist" 
containerID="a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.162832 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe"} err="failed to get container status \"a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe\": rpc error: code = NotFound desc = could not find container \"a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe\": container with ID starting with a21c3fe2f559f2312782de34efc09cbbede97b51252972c68004ffe62b7229fe not found: ID does not exist" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.235611 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:47:36 crc kubenswrapper[4797]: E1013 13:47:36.236004 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.555308 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5be7b2eb-7c23-45d7-9332-8e578d5ce449" (UID: "5be7b2eb-7c23-45d7-9332-8e578d5ce449"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.639912 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b2eb-7c23-45d7-9332-8e578d5ce449-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.722582 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84b4b"] Oct 13 13:47:36 crc kubenswrapper[4797]: I1013 13:47:36.732285 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-84b4b"] Oct 13 13:47:37 crc kubenswrapper[4797]: I1013 13:47:37.247600 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" path="/var/lib/kubelet/pods/5be7b2eb-7c23-45d7-9332-8e578d5ce449/volumes" Oct 13 13:47:49 crc kubenswrapper[4797]: I1013 13:47:49.236604 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:47:49 crc kubenswrapper[4797]: E1013 13:47:49.237660 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:48:01 crc kubenswrapper[4797]: I1013 13:48:01.235911 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:48:01 crc kubenswrapper[4797]: E1013 13:48:01.236598 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:48:12 crc kubenswrapper[4797]: I1013 13:48:12.236217 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:48:12 crc kubenswrapper[4797]: E1013 13:48:12.237344 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.094097 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frm7l"] Oct 13 13:48:19 crc kubenswrapper[4797]: E1013 13:48:19.094915 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="extract-utilities" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.094936 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="extract-utilities" Oct 13 13:48:19 crc kubenswrapper[4797]: E1013 13:48:19.094966 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="extract-content" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.094977 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="extract-content" Oct 13 13:48:19 crc kubenswrapper[4797]: E1013 13:48:19.095003 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="registry-server" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.095015 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="registry-server" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.095244 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be7b2eb-7c23-45d7-9332-8e578d5ce449" containerName="registry-server" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.097648 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.114419 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frm7l"] Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.158083 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdjm\" (UniqueName: \"kubernetes.io/projected/d5d10f1a-a177-43bb-8fac-b4177e8c331b-kube-api-access-7pdjm\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.158147 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-catalog-content\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.158269 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-utilities\") pod 
\"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.259267 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdjm\" (UniqueName: \"kubernetes.io/projected/d5d10f1a-a177-43bb-8fac-b4177e8c331b-kube-api-access-7pdjm\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.259309 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-catalog-content\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.259371 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-utilities\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.259775 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-utilities\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.260323 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-catalog-content\") pod \"certified-operators-frm7l\" (UID: 
\"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.281571 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdjm\" (UniqueName: \"kubernetes.io/projected/d5d10f1a-a177-43bb-8fac-b4177e8c331b-kube-api-access-7pdjm\") pod \"certified-operators-frm7l\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.419771 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:19 crc kubenswrapper[4797]: I1013 13:48:19.896012 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frm7l"] Oct 13 13:48:20 crc kubenswrapper[4797]: I1013 13:48:20.441623 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerID="93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e" exitCode=0 Oct 13 13:48:20 crc kubenswrapper[4797]: I1013 13:48:20.441665 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerDied","Data":"93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e"} Oct 13 13:48:20 crc kubenswrapper[4797]: I1013 13:48:20.441687 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerStarted","Data":"0f62814529de5b0380e806a921cf9bacb224c5f7b8d591109c0d8389b57ed900"} Oct 13 13:48:21 crc kubenswrapper[4797]: I1013 13:48:21.461256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" 
event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerStarted","Data":"c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e"} Oct 13 13:48:22 crc kubenswrapper[4797]: I1013 13:48:22.471997 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerID="c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e" exitCode=0 Oct 13 13:48:22 crc kubenswrapper[4797]: I1013 13:48:22.472081 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerDied","Data":"c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e"} Oct 13 13:48:23 crc kubenswrapper[4797]: I1013 13:48:23.486538 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerStarted","Data":"41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c"} Oct 13 13:48:23 crc kubenswrapper[4797]: I1013 13:48:23.508623 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frm7l" podStartSLOduration=1.9544368909999998 podStartE2EDuration="4.508601628s" podCreationTimestamp="2025-10-13 13:48:19 +0000 UTC" firstStartedPulling="2025-10-13 13:48:20.443089439 +0000 UTC m=+2477.976639715" lastFinishedPulling="2025-10-13 13:48:22.997254196 +0000 UTC m=+2480.530804452" observedRunningTime="2025-10-13 13:48:23.508108266 +0000 UTC m=+2481.041658532" watchObservedRunningTime="2025-10-13 13:48:23.508601628 +0000 UTC m=+2481.042151894" Oct 13 13:48:25 crc kubenswrapper[4797]: I1013 13:48:25.236176 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:48:25 crc kubenswrapper[4797]: E1013 13:48:25.236388 4797 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:48:29 crc kubenswrapper[4797]: I1013 13:48:29.420854 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:29 crc kubenswrapper[4797]: I1013 13:48:29.421171 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:29 crc kubenswrapper[4797]: I1013 13:48:29.468157 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:29 crc kubenswrapper[4797]: I1013 13:48:29.585502 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:29 crc kubenswrapper[4797]: I1013 13:48:29.711560 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frm7l"] Oct 13 13:48:31 crc kubenswrapper[4797]: I1013 13:48:31.546645 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frm7l" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="registry-server" containerID="cri-o://41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c" gracePeriod=2 Oct 13 13:48:31 crc kubenswrapper[4797]: I1013 13:48:31.990262 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.032128 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pdjm\" (UniqueName: \"kubernetes.io/projected/d5d10f1a-a177-43bb-8fac-b4177e8c331b-kube-api-access-7pdjm\") pod \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.032211 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-utilities\") pod \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.032246 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-catalog-content\") pod \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\" (UID: \"d5d10f1a-a177-43bb-8fac-b4177e8c331b\") " Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.033430 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-utilities" (OuterVolumeSpecName: "utilities") pod "d5d10f1a-a177-43bb-8fac-b4177e8c331b" (UID: "d5d10f1a-a177-43bb-8fac-b4177e8c331b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.039695 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d10f1a-a177-43bb-8fac-b4177e8c331b-kube-api-access-7pdjm" (OuterVolumeSpecName: "kube-api-access-7pdjm") pod "d5d10f1a-a177-43bb-8fac-b4177e8c331b" (UID: "d5d10f1a-a177-43bb-8fac-b4177e8c331b"). InnerVolumeSpecName "kube-api-access-7pdjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.078674 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5d10f1a-a177-43bb-8fac-b4177e8c331b" (UID: "d5d10f1a-a177-43bb-8fac-b4177e8c331b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.133398 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pdjm\" (UniqueName: \"kubernetes.io/projected/d5d10f1a-a177-43bb-8fac-b4177e8c331b-kube-api-access-7pdjm\") on node \"crc\" DevicePath \"\"" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.133448 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.133464 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d10f1a-a177-43bb-8fac-b4177e8c331b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.555873 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerID="41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c" exitCode=0 Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.555938 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frm7l" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.555965 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerDied","Data":"41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c"} Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.556284 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frm7l" event={"ID":"d5d10f1a-a177-43bb-8fac-b4177e8c331b","Type":"ContainerDied","Data":"0f62814529de5b0380e806a921cf9bacb224c5f7b8d591109c0d8389b57ed900"} Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.556311 4797 scope.go:117] "RemoveContainer" containerID="41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.593627 4797 scope.go:117] "RemoveContainer" containerID="c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.595683 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frm7l"] Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.600623 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frm7l"] Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.612839 4797 scope.go:117] "RemoveContainer" containerID="93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.643558 4797 scope.go:117] "RemoveContainer" containerID="41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c" Oct 13 13:48:32 crc kubenswrapper[4797]: E1013 13:48:32.644074 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c\": container with ID starting with 41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c not found: ID does not exist" containerID="41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.644120 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c"} err="failed to get container status \"41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c\": rpc error: code = NotFound desc = could not find container \"41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c\": container with ID starting with 41bfedc1946798c8b4fe6573980a9aae0b3298eba4d167bcad1d28a9ed4f490c not found: ID does not exist" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.644151 4797 scope.go:117] "RemoveContainer" containerID="c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e" Oct 13 13:48:32 crc kubenswrapper[4797]: E1013 13:48:32.644534 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e\": container with ID starting with c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e not found: ID does not exist" containerID="c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.644581 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e"} err="failed to get container status \"c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e\": rpc error: code = NotFound desc = could not find container \"c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e\": container with ID 
starting with c19a2b720aa1704d6671ae1baae314da4c29ab3d4acea40fd52e338a0765d41e not found: ID does not exist" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.644617 4797 scope.go:117] "RemoveContainer" containerID="93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e" Oct 13 13:48:32 crc kubenswrapper[4797]: E1013 13:48:32.645027 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e\": container with ID starting with 93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e not found: ID does not exist" containerID="93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e" Oct 13 13:48:32 crc kubenswrapper[4797]: I1013 13:48:32.645070 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e"} err="failed to get container status \"93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e\": rpc error: code = NotFound desc = could not find container \"93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e\": container with ID starting with 93a635ce518a86c8294c0d04d154e4fe2031b4c7eceaf2491ea020ba1535c73e not found: ID does not exist" Oct 13 13:48:33 crc kubenswrapper[4797]: I1013 13:48:33.244780 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" path="/var/lib/kubelet/pods/d5d10f1a-a177-43bb-8fac-b4177e8c331b/volumes" Oct 13 13:48:40 crc kubenswrapper[4797]: I1013 13:48:40.236689 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:48:40 crc kubenswrapper[4797]: E1013 13:48:40.237529 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:48:52 crc kubenswrapper[4797]: I1013 13:48:52.235997 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:48:52 crc kubenswrapper[4797]: E1013 13:48:52.236877 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:49:04 crc kubenswrapper[4797]: I1013 13:49:04.236289 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:49:04 crc kubenswrapper[4797]: E1013 13:49:04.237580 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:49:18 crc kubenswrapper[4797]: I1013 13:49:18.237637 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:49:18 crc kubenswrapper[4797]: E1013 13:49:18.238588 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:49:33 crc kubenswrapper[4797]: I1013 13:49:33.246704 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:49:33 crc kubenswrapper[4797]: E1013 13:49:33.247742 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:49:48 crc kubenswrapper[4797]: I1013 13:49:48.236880 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:49:49 crc kubenswrapper[4797]: I1013 13:49:49.218484 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"bc5d3263f74058423f109919a9998cad948554cc01e824f58be41a872a01286e"} Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.035094 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95ssx"] Oct 13 13:51:16 crc kubenswrapper[4797]: E1013 13:51:16.036120 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="extract-utilities" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.036256 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="extract-utilities" Oct 13 13:51:16 crc kubenswrapper[4797]: E1013 13:51:16.036285 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="extract-content" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.036298 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="extract-content" Oct 13 13:51:16 crc kubenswrapper[4797]: E1013 13:51:16.036345 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="registry-server" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.036357 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="registry-server" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.036598 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d10f1a-a177-43bb-8fac-b4177e8c331b" containerName="registry-server" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.038397 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.057398 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95ssx"] Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.068204 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw86s\" (UniqueName: \"kubernetes.io/projected/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-kube-api-access-tw86s\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.068286 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-catalog-content\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.068405 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-utilities\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.169881 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw86s\" (UniqueName: \"kubernetes.io/projected/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-kube-api-access-tw86s\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.169977 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-catalog-content\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.170045 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-utilities\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.170597 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-utilities\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.170604 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-catalog-content\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.187423 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw86s\" (UniqueName: \"kubernetes.io/projected/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-kube-api-access-tw86s\") pod \"community-operators-95ssx\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.372914 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.806266 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95ssx"] Oct 13 13:51:16 crc kubenswrapper[4797]: I1013 13:51:16.952028 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95ssx" event={"ID":"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320","Type":"ContainerStarted","Data":"ca82cb50bb0e40963499ad830e289d7a1605882d9758806e8ab66e19b625c1c7"} Oct 13 13:51:17 crc kubenswrapper[4797]: I1013 13:51:17.964088 4797 generic.go:334] "Generic (PLEG): container finished" podID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerID="34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d" exitCode=0 Oct 13 13:51:17 crc kubenswrapper[4797]: I1013 13:51:17.964215 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95ssx" event={"ID":"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320","Type":"ContainerDied","Data":"34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d"} Oct 13 13:51:18 crc kubenswrapper[4797]: I1013 13:51:18.974258 4797 generic.go:334] "Generic (PLEG): container finished" podID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerID="36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9" exitCode=0 Oct 13 13:51:18 crc kubenswrapper[4797]: I1013 13:51:18.974299 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95ssx" event={"ID":"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320","Type":"ContainerDied","Data":"36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9"} Oct 13 13:51:19 crc kubenswrapper[4797]: I1013 13:51:19.985845 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95ssx" 
event={"ID":"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320","Type":"ContainerStarted","Data":"995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547"} Oct 13 13:51:20 crc kubenswrapper[4797]: I1013 13:51:20.010764 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95ssx" podStartSLOduration=2.613009144 podStartE2EDuration="4.010735806s" podCreationTimestamp="2025-10-13 13:51:16 +0000 UTC" firstStartedPulling="2025-10-13 13:51:17.967359818 +0000 UTC m=+2655.500910114" lastFinishedPulling="2025-10-13 13:51:19.36508652 +0000 UTC m=+2656.898636776" observedRunningTime="2025-10-13 13:51:20.007473486 +0000 UTC m=+2657.541023812" watchObservedRunningTime="2025-10-13 13:51:20.010735806 +0000 UTC m=+2657.544286102" Oct 13 13:51:26 crc kubenswrapper[4797]: I1013 13:51:26.374195 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:26 crc kubenswrapper[4797]: I1013 13:51:26.376075 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:26 crc kubenswrapper[4797]: I1013 13:51:26.444652 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:27 crc kubenswrapper[4797]: I1013 13:51:27.101153 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:29 crc kubenswrapper[4797]: I1013 13:51:29.421888 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95ssx"] Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.070267 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95ssx" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="registry-server" 
containerID="cri-o://995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547" gracePeriod=2 Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.492240 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.591097 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-utilities\") pod \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.591188 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw86s\" (UniqueName: \"kubernetes.io/projected/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-kube-api-access-tw86s\") pod \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.591220 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-catalog-content\") pod \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\" (UID: \"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320\") " Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.593200 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-utilities" (OuterVolumeSpecName: "utilities") pod "ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" (UID: "ef6a82e9-8d01-4d15-a6ba-282d0a3e1320"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.596552 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-kube-api-access-tw86s" (OuterVolumeSpecName: "kube-api-access-tw86s") pod "ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" (UID: "ef6a82e9-8d01-4d15-a6ba-282d0a3e1320"). InnerVolumeSpecName "kube-api-access-tw86s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.642580 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" (UID: "ef6a82e9-8d01-4d15-a6ba-282d0a3e1320"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.692936 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.692996 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw86s\" (UniqueName: \"kubernetes.io/projected/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-kube-api-access-tw86s\") on node \"crc\" DevicePath \"\"" Oct 13 13:51:30 crc kubenswrapper[4797]: I1013 13:51:30.693022 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.083741 4797 generic.go:334] "Generic (PLEG): container finished" podID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" 
containerID="995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547" exitCode=0 Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.083929 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95ssx" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.084360 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95ssx" event={"ID":"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320","Type":"ContainerDied","Data":"995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547"} Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.084591 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95ssx" event={"ID":"ef6a82e9-8d01-4d15-a6ba-282d0a3e1320","Type":"ContainerDied","Data":"ca82cb50bb0e40963499ad830e289d7a1605882d9758806e8ab66e19b625c1c7"} Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.084727 4797 scope.go:117] "RemoveContainer" containerID="995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.116959 4797 scope.go:117] "RemoveContainer" containerID="36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.137958 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95ssx"] Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.147126 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-95ssx"] Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.155356 4797 scope.go:117] "RemoveContainer" containerID="34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.193973 4797 scope.go:117] "RemoveContainer" containerID="995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547" Oct 13 
13:51:31 crc kubenswrapper[4797]: E1013 13:51:31.194904 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547\": container with ID starting with 995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547 not found: ID does not exist" containerID="995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.194942 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547"} err="failed to get container status \"995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547\": rpc error: code = NotFound desc = could not find container \"995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547\": container with ID starting with 995b9266a8d3a90bc045b45c94a9dceddbde10f74b5d3e782f7ee65088817547 not found: ID does not exist" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.194974 4797 scope.go:117] "RemoveContainer" containerID="36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9" Oct 13 13:51:31 crc kubenswrapper[4797]: E1013 13:51:31.195905 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9\": container with ID starting with 36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9 not found: ID does not exist" containerID="36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.195971 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9"} err="failed to get container status 
\"36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9\": rpc error: code = NotFound desc = could not find container \"36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9\": container with ID starting with 36a748e8d29ffa77c189278edf3548b60bb6840a6c348018a3b8cf258de9acd9 not found: ID does not exist" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.196017 4797 scope.go:117] "RemoveContainer" containerID="34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d" Oct 13 13:51:31 crc kubenswrapper[4797]: E1013 13:51:31.197268 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d\": container with ID starting with 34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d not found: ID does not exist" containerID="34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.197301 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d"} err="failed to get container status \"34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d\": rpc error: code = NotFound desc = could not find container \"34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d\": container with ID starting with 34f9453725d2019a7818f1a24ea9d87210001af0acefd00880694f111403ec5d not found: ID does not exist" Oct 13 13:51:31 crc kubenswrapper[4797]: I1013 13:51:31.250321 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" path="/var/lib/kubelet/pods/ef6a82e9-8d01-4d15-a6ba-282d0a3e1320/volumes" Oct 13 13:51:48 crc kubenswrapper[4797]: I1013 13:51:48.119710 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:51:48 crc kubenswrapper[4797]: I1013 13:51:48.120327 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:52:18 crc kubenswrapper[4797]: I1013 13:52:18.119718 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:52:18 crc kubenswrapper[4797]: I1013 13:52:18.120367 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.120580 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.121480 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.121568 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.123329 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc5d3263f74058423f109919a9998cad948554cc01e824f58be41a872a01286e"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.123428 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://bc5d3263f74058423f109919a9998cad948554cc01e824f58be41a872a01286e" gracePeriod=600 Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.718466 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="bc5d3263f74058423f109919a9998cad948554cc01e824f58be41a872a01286e" exitCode=0 Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.718543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"bc5d3263f74058423f109919a9998cad948554cc01e824f58be41a872a01286e"} Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.718850 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254"} Oct 13 13:52:48 crc kubenswrapper[4797]: I1013 13:52:48.718871 4797 scope.go:117] "RemoveContainer" containerID="7fc962c2b3cd9094de09eb9de80039049b8a00525bc7d3b9657e49e3e585d698" Oct 13 13:54:48 crc kubenswrapper[4797]: I1013 13:54:48.119632 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:54:48 crc kubenswrapper[4797]: I1013 13:54:48.120373 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.212890 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r7nth"] Oct 13 13:55:12 crc kubenswrapper[4797]: E1013 13:55:12.214363 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="registry-server" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.214381 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="registry-server" Oct 13 13:55:12 crc kubenswrapper[4797]: E1013 13:55:12.214423 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="extract-utilities" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.214433 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="extract-utilities" Oct 13 13:55:12 crc kubenswrapper[4797]: E1013 13:55:12.214460 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="extract-content" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.214468 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="extract-content" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.214693 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6a82e9-8d01-4d15-a6ba-282d0a3e1320" containerName="registry-server" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.216384 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.227520 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7nth"] Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.368020 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-catalog-content\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.368858 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-utilities\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.368887 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7mg\" (UniqueName: \"kubernetes.io/projected/6c0c0886-defb-458f-970b-6938d71db077-kube-api-access-cq7mg\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.469947 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-catalog-content\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.470015 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-utilities\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.470038 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7mg\" (UniqueName: \"kubernetes.io/projected/6c0c0886-defb-458f-970b-6938d71db077-kube-api-access-cq7mg\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.470616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-catalog-content\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.470760 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-utilities\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.512707 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7mg\" (UniqueName: \"kubernetes.io/projected/6c0c0886-defb-458f-970b-6938d71db077-kube-api-access-cq7mg\") pod \"redhat-marketplace-r7nth\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:12 crc kubenswrapper[4797]: I1013 13:55:12.546330 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:13 crc kubenswrapper[4797]: I1013 13:55:13.020681 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7nth"] Oct 13 13:55:14 crc kubenswrapper[4797]: I1013 13:55:14.040798 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c0c0886-defb-458f-970b-6938d71db077" containerID="03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024" exitCode=0 Oct 13 13:55:14 crc kubenswrapper[4797]: I1013 13:55:14.040871 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7nth" event={"ID":"6c0c0886-defb-458f-970b-6938d71db077","Type":"ContainerDied","Data":"03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024"} Oct 13 13:55:14 crc kubenswrapper[4797]: I1013 13:55:14.041160 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7nth" event={"ID":"6c0c0886-defb-458f-970b-6938d71db077","Type":"ContainerStarted","Data":"0c8d88bca71da6d38a53a09eb4af086c3193786110b7d97f0ea91646665646a8"} Oct 13 13:55:14 crc kubenswrapper[4797]: I1013 
13:55:14.043583 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 13:55:16 crc kubenswrapper[4797]: I1013 13:55:16.068062 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c0c0886-defb-458f-970b-6938d71db077" containerID="a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf" exitCode=0 Oct 13 13:55:16 crc kubenswrapper[4797]: I1013 13:55:16.068917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7nth" event={"ID":"6c0c0886-defb-458f-970b-6938d71db077","Type":"ContainerDied","Data":"a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf"} Oct 13 13:55:17 crc kubenswrapper[4797]: I1013 13:55:17.080362 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7nth" event={"ID":"6c0c0886-defb-458f-970b-6938d71db077","Type":"ContainerStarted","Data":"cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583"} Oct 13 13:55:17 crc kubenswrapper[4797]: I1013 13:55:17.105091 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r7nth" podStartSLOduration=2.609532134 podStartE2EDuration="5.105071344s" podCreationTimestamp="2025-10-13 13:55:12 +0000 UTC" firstStartedPulling="2025-10-13 13:55:14.043314697 +0000 UTC m=+2891.576864943" lastFinishedPulling="2025-10-13 13:55:16.538853897 +0000 UTC m=+2894.072404153" observedRunningTime="2025-10-13 13:55:17.101402744 +0000 UTC m=+2894.634953020" watchObservedRunningTime="2025-10-13 13:55:17.105071344 +0000 UTC m=+2894.638621620" Oct 13 13:55:18 crc kubenswrapper[4797]: I1013 13:55:18.120238 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 13 13:55:18 crc kubenswrapper[4797]: I1013 13:55:18.120511 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 13:55:22 crc kubenswrapper[4797]: I1013 13:55:22.547342 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:22 crc kubenswrapper[4797]: I1013 13:55:22.547752 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:22 crc kubenswrapper[4797]: I1013 13:55:22.590322 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:23 crc kubenswrapper[4797]: I1013 13:55:23.166561 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:23 crc kubenswrapper[4797]: I1013 13:55:23.218492 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7nth"] Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.140161 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r7nth" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="registry-server" containerID="cri-o://cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583" gracePeriod=2 Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.512254 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.669024 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-catalog-content\") pod \"6c0c0886-defb-458f-970b-6938d71db077\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.669085 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7mg\" (UniqueName: \"kubernetes.io/projected/6c0c0886-defb-458f-970b-6938d71db077-kube-api-access-cq7mg\") pod \"6c0c0886-defb-458f-970b-6938d71db077\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.669140 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-utilities\") pod \"6c0c0886-defb-458f-970b-6938d71db077\" (UID: \"6c0c0886-defb-458f-970b-6938d71db077\") " Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.670124 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-utilities" (OuterVolumeSpecName: "utilities") pod "6c0c0886-defb-458f-970b-6938d71db077" (UID: "6c0c0886-defb-458f-970b-6938d71db077"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.682564 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0c0886-defb-458f-970b-6938d71db077-kube-api-access-cq7mg" (OuterVolumeSpecName: "kube-api-access-cq7mg") pod "6c0c0886-defb-458f-970b-6938d71db077" (UID: "6c0c0886-defb-458f-970b-6938d71db077"). InnerVolumeSpecName "kube-api-access-cq7mg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.686134 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c0c0886-defb-458f-970b-6938d71db077" (UID: "6c0c0886-defb-458f-970b-6938d71db077"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.770172 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.770210 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7mg\" (UniqueName: \"kubernetes.io/projected/6c0c0886-defb-458f-970b-6938d71db077-kube-api-access-cq7mg\") on node \"crc\" DevicePath \"\"" Oct 13 13:55:25 crc kubenswrapper[4797]: I1013 13:55:25.770220 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c0c0886-defb-458f-970b-6938d71db077-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.152112 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c0c0886-defb-458f-970b-6938d71db077" containerID="cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583" exitCode=0 Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.152191 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r7nth" event={"ID":"6c0c0886-defb-458f-970b-6938d71db077","Type":"ContainerDied","Data":"cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583"} Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.152243 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-r7nth" event={"ID":"6c0c0886-defb-458f-970b-6938d71db077","Type":"ContainerDied","Data":"0c8d88bca71da6d38a53a09eb4af086c3193786110b7d97f0ea91646665646a8"} Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.152279 4797 scope.go:117] "RemoveContainer" containerID="cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.152506 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r7nth" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.180048 4797 scope.go:117] "RemoveContainer" containerID="a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.204151 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7nth"] Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.209017 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r7nth"] Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.223500 4797 scope.go:117] "RemoveContainer" containerID="03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.245624 4797 scope.go:117] "RemoveContainer" containerID="cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583" Oct 13 13:55:26 crc kubenswrapper[4797]: E1013 13:55:26.246226 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583\": container with ID starting with cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583 not found: ID does not exist" containerID="cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.246256 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583"} err="failed to get container status \"cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583\": rpc error: code = NotFound desc = could not find container \"cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583\": container with ID starting with cbee44c3be5eac6fca1a03f64375cc26742021684a61d5db7d6f1ff40e9fa583 not found: ID does not exist" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.246278 4797 scope.go:117] "RemoveContainer" containerID="a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf" Oct 13 13:55:26 crc kubenswrapper[4797]: E1013 13:55:26.246637 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf\": container with ID starting with a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf not found: ID does not exist" containerID="a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.246654 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf"} err="failed to get container status \"a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf\": rpc error: code = NotFound desc = could not find container \"a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf\": container with ID starting with a9235bb8c076fb86717f5d8de1e5f8e55fed9a89e40a5b3de04d41bab1ce5dcf not found: ID does not exist" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.246669 4797 scope.go:117] "RemoveContainer" containerID="03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024" Oct 13 13:55:26 crc kubenswrapper[4797]: E1013 
13:55:26.246973 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024\": container with ID starting with 03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024 not found: ID does not exist" containerID="03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024" Oct 13 13:55:26 crc kubenswrapper[4797]: I1013 13:55:26.246996 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024"} err="failed to get container status \"03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024\": rpc error: code = NotFound desc = could not find container \"03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024\": container with ID starting with 03aaaee3083a090b6f7d80806dd8199ca53394e9a2424fdbdeaa006f36e5c024 not found: ID does not exist" Oct 13 13:55:27 crc kubenswrapper[4797]: I1013 13:55:27.246382 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0c0886-defb-458f-970b-6938d71db077" path="/var/lib/kubelet/pods/6c0c0886-defb-458f-970b-6938d71db077/volumes" Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.120264 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.121238 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.121312 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.122132 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.122227 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" gracePeriod=600 Oct 13 13:55:48 crc kubenswrapper[4797]: E1013 13:55:48.244662 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.336525 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" exitCode=0 Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.336579 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254"} Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.336678 4797 scope.go:117] "RemoveContainer" containerID="bc5d3263f74058423f109919a9998cad948554cc01e824f58be41a872a01286e" Oct 13 13:55:48 crc kubenswrapper[4797]: I1013 13:55:48.337350 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:55:48 crc kubenswrapper[4797]: E1013 13:55:48.337591 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:56:02 crc kubenswrapper[4797]: I1013 13:56:02.236608 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:56:02 crc kubenswrapper[4797]: E1013 13:56:02.237401 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:56:16 crc kubenswrapper[4797]: I1013 13:56:16.236607 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:56:16 crc kubenswrapper[4797]: E1013 13:56:16.237712 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:56:28 crc kubenswrapper[4797]: I1013 13:56:28.236263 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:56:28 crc kubenswrapper[4797]: E1013 13:56:28.238174 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:56:40 crc kubenswrapper[4797]: I1013 13:56:40.236240 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:56:40 crc kubenswrapper[4797]: E1013 13:56:40.236981 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:56:53 crc kubenswrapper[4797]: I1013 13:56:53.244213 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:56:53 crc kubenswrapper[4797]: E1013 13:56:53.245318 4797 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:57:08 crc kubenswrapper[4797]: I1013 13:57:08.236343 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:57:08 crc kubenswrapper[4797]: E1013 13:57:08.237639 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:57:19 crc kubenswrapper[4797]: I1013 13:57:19.236161 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:57:19 crc kubenswrapper[4797]: E1013 13:57:19.237271 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:57:31 crc kubenswrapper[4797]: I1013 13:57:31.236174 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:57:31 crc kubenswrapper[4797]: E1013 13:57:31.236946 4797 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:57:43 crc kubenswrapper[4797]: I1013 13:57:43.240672 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:57:43 crc kubenswrapper[4797]: E1013 13:57:43.241562 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:57:57 crc kubenswrapper[4797]: I1013 13:57:57.236723 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:57:57 crc kubenswrapper[4797]: E1013 13:57:57.237538 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:58:11 crc kubenswrapper[4797]: I1013 13:58:11.237106 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:58:11 crc kubenswrapper[4797]: E1013 13:58:11.238328 4797 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:58:25 crc kubenswrapper[4797]: I1013 13:58:25.236566 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:58:25 crc kubenswrapper[4797]: E1013 13:58:25.237323 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:58:39 crc kubenswrapper[4797]: I1013 13:58:39.236192 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:58:39 crc kubenswrapper[4797]: E1013 13:58:39.236863 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:58:53 crc kubenswrapper[4797]: I1013 13:58:53.247034 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:58:53 crc kubenswrapper[4797]: E1013 
13:58:53.248129 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:59:04 crc kubenswrapper[4797]: I1013 13:59:04.236330 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:59:04 crc kubenswrapper[4797]: E1013 13:59:04.237122 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:59:16 crc kubenswrapper[4797]: I1013 13:59:16.236214 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:59:16 crc kubenswrapper[4797]: E1013 13:59:16.238397 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:59:29 crc kubenswrapper[4797]: I1013 13:59:29.237055 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:59:29 crc 
kubenswrapper[4797]: E1013 13:59:29.237580 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:59:40 crc kubenswrapper[4797]: I1013 13:59:40.236027 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:59:40 crc kubenswrapper[4797]: E1013 13:59:40.238097 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 13:59:54 crc kubenswrapper[4797]: I1013 13:59:54.236176 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 13:59:54 crc kubenswrapper[4797]: E1013 13:59:54.237030 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.196403 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd"] Oct 13 14:00:00 crc kubenswrapper[4797]: E1013 14:00:00.198744 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="registry-server" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.198855 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="registry-server" Oct 13 14:00:00 crc kubenswrapper[4797]: E1013 14:00:00.198895 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="extract-content" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.198906 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="extract-content" Oct 13 14:00:00 crc kubenswrapper[4797]: E1013 14:00:00.198934 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="extract-utilities" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.198947 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="extract-utilities" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.199334 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0c0886-defb-458f-970b-6938d71db077" containerName="registry-server" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.200367 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.201665 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd"] Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.202519 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.203042 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.320327 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9pp5\" (UniqueName: \"kubernetes.io/projected/b6677772-d44a-4d3a-b496-5a2e5970db18-kube-api-access-r9pp5\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.320587 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6677772-d44a-4d3a-b496-5a2e5970db18-config-volume\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.320641 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6677772-d44a-4d3a-b496-5a2e5970db18-secret-volume\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.421893 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6677772-d44a-4d3a-b496-5a2e5970db18-config-volume\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.421955 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6677772-d44a-4d3a-b496-5a2e5970db18-secret-volume\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.422001 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9pp5\" (UniqueName: \"kubernetes.io/projected/b6677772-d44a-4d3a-b496-5a2e5970db18-kube-api-access-r9pp5\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.423937 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6677772-d44a-4d3a-b496-5a2e5970db18-config-volume\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.428134 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b6677772-d44a-4d3a-b496-5a2e5970db18-secret-volume\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.455294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9pp5\" (UniqueName: \"kubernetes.io/projected/b6677772-d44a-4d3a-b496-5a2e5970db18-kube-api-access-r9pp5\") pod \"collect-profiles-29339400-r7fxd\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.524386 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:00 crc kubenswrapper[4797]: I1013 14:00:00.952399 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd"] Oct 13 14:00:01 crc kubenswrapper[4797]: I1013 14:00:01.341461 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" event={"ID":"b6677772-d44a-4d3a-b496-5a2e5970db18","Type":"ContainerStarted","Data":"968437b0f5ba9d1645208f39f8303e77d536fdf43eb97197f588c9f7832bfdc4"} Oct 13 14:00:01 crc kubenswrapper[4797]: I1013 14:00:01.342753 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" event={"ID":"b6677772-d44a-4d3a-b496-5a2e5970db18","Type":"ContainerStarted","Data":"23a09d87d1bbf159ed7eb530de8fd57183824964050c01f851f9db671801d9ec"} Oct 13 14:00:01 crc kubenswrapper[4797]: I1013 14:00:01.360107 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" 
podStartSLOduration=1.360087128 podStartE2EDuration="1.360087128s" podCreationTimestamp="2025-10-13 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:00:01.358830717 +0000 UTC m=+3178.892380993" watchObservedRunningTime="2025-10-13 14:00:01.360087128 +0000 UTC m=+3178.893637384" Oct 13 14:00:02 crc kubenswrapper[4797]: I1013 14:00:02.383858 4797 generic.go:334] "Generic (PLEG): container finished" podID="b6677772-d44a-4d3a-b496-5a2e5970db18" containerID="968437b0f5ba9d1645208f39f8303e77d536fdf43eb97197f588c9f7832bfdc4" exitCode=0 Oct 13 14:00:02 crc kubenswrapper[4797]: I1013 14:00:02.383957 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" event={"ID":"b6677772-d44a-4d3a-b496-5a2e5970db18","Type":"ContainerDied","Data":"968437b0f5ba9d1645208f39f8303e77d536fdf43eb97197f588c9f7832bfdc4"} Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.651219 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.770256 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9pp5\" (UniqueName: \"kubernetes.io/projected/b6677772-d44a-4d3a-b496-5a2e5970db18-kube-api-access-r9pp5\") pod \"b6677772-d44a-4d3a-b496-5a2e5970db18\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.770862 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6677772-d44a-4d3a-b496-5a2e5970db18-config-volume\") pod \"b6677772-d44a-4d3a-b496-5a2e5970db18\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.770985 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6677772-d44a-4d3a-b496-5a2e5970db18-secret-volume\") pod \"b6677772-d44a-4d3a-b496-5a2e5970db18\" (UID: \"b6677772-d44a-4d3a-b496-5a2e5970db18\") " Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.771990 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6677772-d44a-4d3a-b496-5a2e5970db18-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6677772-d44a-4d3a-b496-5a2e5970db18" (UID: "b6677772-d44a-4d3a-b496-5a2e5970db18"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.778059 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6677772-d44a-4d3a-b496-5a2e5970db18-kube-api-access-r9pp5" (OuterVolumeSpecName: "kube-api-access-r9pp5") pod "b6677772-d44a-4d3a-b496-5a2e5970db18" (UID: "b6677772-d44a-4d3a-b496-5a2e5970db18"). 
InnerVolumeSpecName "kube-api-access-r9pp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.778210 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6677772-d44a-4d3a-b496-5a2e5970db18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6677772-d44a-4d3a-b496-5a2e5970db18" (UID: "b6677772-d44a-4d3a-b496-5a2e5970db18"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.872612 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6677772-d44a-4d3a-b496-5a2e5970db18-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.872658 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9pp5\" (UniqueName: \"kubernetes.io/projected/b6677772-d44a-4d3a-b496-5a2e5970db18-kube-api-access-r9pp5\") on node \"crc\" DevicePath \"\"" Oct 13 14:00:03 crc kubenswrapper[4797]: I1013 14:00:03.872673 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6677772-d44a-4d3a-b496-5a2e5970db18-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:00:04 crc kubenswrapper[4797]: I1013 14:00:04.401983 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" event={"ID":"b6677772-d44a-4d3a-b496-5a2e5970db18","Type":"ContainerDied","Data":"23a09d87d1bbf159ed7eb530de8fd57183824964050c01f851f9db671801d9ec"} Oct 13 14:00:04 crc kubenswrapper[4797]: I1013 14:00:04.402029 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a09d87d1bbf159ed7eb530de8fd57183824964050c01f851f9db671801d9ec" Oct 13 14:00:04 crc kubenswrapper[4797]: I1013 14:00:04.402083 4797 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd" Oct 13 14:00:04 crc kubenswrapper[4797]: I1013 14:00:04.449167 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp"] Oct 13 14:00:04 crc kubenswrapper[4797]: I1013 14:00:04.455114 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339355-zk5lp"] Oct 13 14:00:05 crc kubenswrapper[4797]: I1013 14:00:05.236860 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 14:00:05 crc kubenswrapper[4797]: E1013 14:00:05.237431 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:00:05 crc kubenswrapper[4797]: I1013 14:00:05.250625 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dafde8d-885e-4344-86a8-f384c52b4b56" path="/var/lib/kubelet/pods/2dafde8d-885e-4344-86a8-f384c52b4b56/volumes" Oct 13 14:00:11 crc kubenswrapper[4797]: I1013 14:00:11.176423 4797 scope.go:117] "RemoveContainer" containerID="00b045b78ab122f5ca663f60b236c4a5bce0f800121e007c6ce781004fb0e3f4" Oct 13 14:00:20 crc kubenswrapper[4797]: I1013 14:00:20.236786 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 14:00:20 crc kubenswrapper[4797]: E1013 14:00:20.237709 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:00:31 crc kubenswrapper[4797]: I1013 14:00:31.237645 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 14:00:31 crc kubenswrapper[4797]: E1013 14:00:31.238774 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:00:46 crc kubenswrapper[4797]: I1013 14:00:46.236319 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 14:00:46 crc kubenswrapper[4797]: E1013 14:00:46.237021 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:00:58 crc kubenswrapper[4797]: I1013 14:00:58.236037 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 14:00:58 crc kubenswrapper[4797]: I1013 14:00:58.822792 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"39263090f3bbb8cc8391e1dcd103f422d045f83425792d9a12337578bb539d50"} Oct 13 14:03:18 crc kubenswrapper[4797]: I1013 14:03:18.120373 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:03:18 crc kubenswrapper[4797]: I1013 14:03:18.120948 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:03:48 crc kubenswrapper[4797]: I1013 14:03:48.119775 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:03:48 crc kubenswrapper[4797]: I1013 14:03:48.120313 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.120050 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.120702 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.120762 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.121683 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39263090f3bbb8cc8391e1dcd103f422d045f83425792d9a12337578bb539d50"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.121768 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://39263090f3bbb8cc8391e1dcd103f422d045f83425792d9a12337578bb539d50" gracePeriod=600 Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.277347 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="39263090f3bbb8cc8391e1dcd103f422d045f83425792d9a12337578bb539d50" exitCode=0 Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.277450 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"39263090f3bbb8cc8391e1dcd103f422d045f83425792d9a12337578bb539d50"} Oct 13 14:04:18 crc kubenswrapper[4797]: I1013 14:04:18.277862 4797 scope.go:117] "RemoveContainer" containerID="0135d5e61b48b629dcaffe88a618f8856342cc08171648832811053b15c81254" Oct 13 14:04:19 crc kubenswrapper[4797]: I1013 14:04:19.286270 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a"} Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.043297 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8d7lc"] Oct 13 14:05:18 crc kubenswrapper[4797]: E1013 14:05:18.045106 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6677772-d44a-4d3a-b496-5a2e5970db18" containerName="collect-profiles" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.045162 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6677772-d44a-4d3a-b496-5a2e5970db18" containerName="collect-profiles" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.045405 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6677772-d44a-4d3a-b496-5a2e5970db18" containerName="collect-profiles" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.046977 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.053123 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d7lc"] Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.177920 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t676c\" (UniqueName: \"kubernetes.io/projected/e397abd8-1db4-4614-972e-b6b50f1623b5-kube-api-access-t676c\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.177976 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-utilities\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.178131 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-catalog-content\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.241654 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gxvnl"] Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.243563 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.258479 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxvnl"] Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.279575 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-utilities\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.279644 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-catalog-content\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.279884 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t676c\" (UniqueName: \"kubernetes.io/projected/e397abd8-1db4-4614-972e-b6b50f1623b5-kube-api-access-t676c\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.280152 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-catalog-content\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.280198 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-utilities\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.299624 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t676c\" (UniqueName: \"kubernetes.io/projected/e397abd8-1db4-4614-972e-b6b50f1623b5-kube-api-access-t676c\") pod \"redhat-marketplace-8d7lc\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.381045 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-utilities\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.381103 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh77w\" (UniqueName: \"kubernetes.io/projected/75f6f619-b978-49a6-b22a-55646ac866bb-kube-api-access-rh77w\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.381162 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-catalog-content\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.397254 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.482975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh77w\" (UniqueName: \"kubernetes.io/projected/75f6f619-b978-49a6-b22a-55646ac866bb-kube-api-access-rh77w\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.483032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-catalog-content\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.483197 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-utilities\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.483826 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-utilities\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.484136 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-catalog-content\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " 
pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.504853 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh77w\" (UniqueName: \"kubernetes.io/projected/75f6f619-b978-49a6-b22a-55646ac866bb-kube-api-access-rh77w\") pod \"redhat-operators-gxvnl\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.560278 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.634542 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d7lc"] Oct 13 14:05:18 crc kubenswrapper[4797]: W1013 14:05:18.642233 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode397abd8_1db4_4614_972e_b6b50f1623b5.slice/crio-bd6614291e8ed42a4e773e7fbc87ca490c5e0757fefa7e3a337045a32a212a85 WatchSource:0}: Error finding container bd6614291e8ed42a4e773e7fbc87ca490c5e0757fefa7e3a337045a32a212a85: Status 404 returned error can't find the container with id bd6614291e8ed42a4e773e7fbc87ca490c5e0757fefa7e3a337045a32a212a85 Oct 13 14:05:18 crc kubenswrapper[4797]: I1013 14:05:18.741428 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d7lc" event={"ID":"e397abd8-1db4-4614-972e-b6b50f1623b5","Type":"ContainerStarted","Data":"bd6614291e8ed42a4e773e7fbc87ca490c5e0757fefa7e3a337045a32a212a85"} Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.077926 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxvnl"] Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.748960 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="75f6f619-b978-49a6-b22a-55646ac866bb" containerID="9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6" exitCode=0 Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.749030 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvnl" event={"ID":"75f6f619-b978-49a6-b22a-55646ac866bb","Type":"ContainerDied","Data":"9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6"} Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.749057 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvnl" event={"ID":"75f6f619-b978-49a6-b22a-55646ac866bb","Type":"ContainerStarted","Data":"da7bfeedc79bb9cbd9180f5d5c140405d0729e5f2f00fbbcdce7d1119c9c2ccc"} Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.750647 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.750978 4797 generic.go:334] "Generic (PLEG): container finished" podID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerID="7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98" exitCode=0 Oct 13 14:05:19 crc kubenswrapper[4797]: I1013 14:05:19.751011 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d7lc" event={"ID":"e397abd8-1db4-4614-972e-b6b50f1623b5","Type":"ContainerDied","Data":"7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98"} Oct 13 14:05:20 crc kubenswrapper[4797]: I1013 14:05:20.762911 4797 generic.go:334] "Generic (PLEG): container finished" podID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerID="703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f" exitCode=0 Oct 13 14:05:20 crc kubenswrapper[4797]: I1013 14:05:20.763039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d7lc" 
event={"ID":"e397abd8-1db4-4614-972e-b6b50f1623b5","Type":"ContainerDied","Data":"703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f"} Oct 13 14:05:21 crc kubenswrapper[4797]: I1013 14:05:21.772825 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d7lc" event={"ID":"e397abd8-1db4-4614-972e-b6b50f1623b5","Type":"ContainerStarted","Data":"699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d"} Oct 13 14:05:21 crc kubenswrapper[4797]: I1013 14:05:21.775316 4797 generic.go:334] "Generic (PLEG): container finished" podID="75f6f619-b978-49a6-b22a-55646ac866bb" containerID="e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda" exitCode=0 Oct 13 14:05:21 crc kubenswrapper[4797]: I1013 14:05:21.775356 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvnl" event={"ID":"75f6f619-b978-49a6-b22a-55646ac866bb","Type":"ContainerDied","Data":"e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda"} Oct 13 14:05:21 crc kubenswrapper[4797]: I1013 14:05:21.797894 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8d7lc" podStartSLOduration=2.302011518 podStartE2EDuration="3.797873268s" podCreationTimestamp="2025-10-13 14:05:18 +0000 UTC" firstStartedPulling="2025-10-13 14:05:19.752304016 +0000 UTC m=+3497.285854272" lastFinishedPulling="2025-10-13 14:05:21.248165746 +0000 UTC m=+3498.781716022" observedRunningTime="2025-10-13 14:05:21.79672963 +0000 UTC m=+3499.330279906" watchObservedRunningTime="2025-10-13 14:05:21.797873268 +0000 UTC m=+3499.331423524" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.270874 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qm22w"] Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.272844 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.281902 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qm22w"] Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.454010 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-utilities\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.454426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqt6k\" (UniqueName: \"kubernetes.io/projected/02803a9c-81aa-4d42-9ee5-59777f0b5228-kube-api-access-qqt6k\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.454529 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-catalog-content\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.555879 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqt6k\" (UniqueName: \"kubernetes.io/projected/02803a9c-81aa-4d42-9ee5-59777f0b5228-kube-api-access-qqt6k\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.555949 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-catalog-content\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.555996 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-utilities\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.556584 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-utilities\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.556661 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-catalog-content\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.582741 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqt6k\" (UniqueName: \"kubernetes.io/projected/02803a9c-81aa-4d42-9ee5-59777f0b5228-kube-api-access-qqt6k\") pod \"certified-operators-qm22w\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.608489 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:23 crc kubenswrapper[4797]: I1013 14:05:23.809157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvnl" event={"ID":"75f6f619-b978-49a6-b22a-55646ac866bb","Type":"ContainerStarted","Data":"bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078"} Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.089140 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gxvnl" podStartSLOduration=2.6614190239999997 podStartE2EDuration="6.089120207s" podCreationTimestamp="2025-10-13 14:05:18 +0000 UTC" firstStartedPulling="2025-10-13 14:05:19.750411459 +0000 UTC m=+3497.283961715" lastFinishedPulling="2025-10-13 14:05:23.178112632 +0000 UTC m=+3500.711662898" observedRunningTime="2025-10-13 14:05:23.831284013 +0000 UTC m=+3501.364834289" watchObservedRunningTime="2025-10-13 14:05:24.089120207 +0000 UTC m=+3501.622670463" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.098110 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qm22w"] Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.442588 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-57lrh"] Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.444510 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.457462 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57lrh"] Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.572522 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7lwz\" (UniqueName: \"kubernetes.io/projected/bdb6f76c-3a26-441b-9292-07e1194f4045-kube-api-access-x7lwz\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.572896 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-catalog-content\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.573050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-utilities\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.674367 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-utilities\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.674484 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x7lwz\" (UniqueName: \"kubernetes.io/projected/bdb6f76c-3a26-441b-9292-07e1194f4045-kube-api-access-x7lwz\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.674535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-catalog-content\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.674981 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-catalog-content\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.674980 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-utilities\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.692174 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7lwz\" (UniqueName: \"kubernetes.io/projected/bdb6f76c-3a26-441b-9292-07e1194f4045-kube-api-access-x7lwz\") pod \"community-operators-57lrh\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.762609 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.840495 4797 generic.go:334] "Generic (PLEG): container finished" podID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerID="d51ccfce7a8be36a91943627797e42e1f76e157d1ef6f08547b200fa53e97226" exitCode=0 Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.840581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm22w" event={"ID":"02803a9c-81aa-4d42-9ee5-59777f0b5228","Type":"ContainerDied","Data":"d51ccfce7a8be36a91943627797e42e1f76e157d1ef6f08547b200fa53e97226"} Oct 13 14:05:24 crc kubenswrapper[4797]: I1013 14:05:24.840638 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm22w" event={"ID":"02803a9c-81aa-4d42-9ee5-59777f0b5228","Type":"ContainerStarted","Data":"f944d91a1415b6219ce0680a26bdf4e0bbaf9b8f80be08444cd7c5db4a58ba21"} Oct 13 14:05:25 crc kubenswrapper[4797]: W1013 14:05:25.265267 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb6f76c_3a26_441b_9292_07e1194f4045.slice/crio-5e6bd2b00bd9b78e385ea08b88faea580f02b43919f1d22bb64c04497fe5101c WatchSource:0}: Error finding container 5e6bd2b00bd9b78e385ea08b88faea580f02b43919f1d22bb64c04497fe5101c: Status 404 returned error can't find the container with id 5e6bd2b00bd9b78e385ea08b88faea580f02b43919f1d22bb64c04497fe5101c Oct 13 14:05:25 crc kubenswrapper[4797]: I1013 14:05:25.268785 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57lrh"] Oct 13 14:05:25 crc kubenswrapper[4797]: I1013 14:05:25.852673 4797 generic.go:334] "Generic (PLEG): container finished" podID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerID="a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb" exitCode=0 Oct 13 14:05:25 crc kubenswrapper[4797]: I1013 
14:05:25.852920 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerDied","Data":"a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb"} Oct 13 14:05:25 crc kubenswrapper[4797]: I1013 14:05:25.853112 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerStarted","Data":"5e6bd2b00bd9b78e385ea08b88faea580f02b43919f1d22bb64c04497fe5101c"} Oct 13 14:05:26 crc kubenswrapper[4797]: I1013 14:05:26.865459 4797 generic.go:334] "Generic (PLEG): container finished" podID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerID="24f2ad224b465645fb6309551d1b376fb966212f5e4c6acd9575c37663ad08de" exitCode=0 Oct 13 14:05:26 crc kubenswrapper[4797]: I1013 14:05:26.865581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm22w" event={"ID":"02803a9c-81aa-4d42-9ee5-59777f0b5228","Type":"ContainerDied","Data":"24f2ad224b465645fb6309551d1b376fb966212f5e4c6acd9575c37663ad08de"} Oct 13 14:05:27 crc kubenswrapper[4797]: I1013 14:05:27.876085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm22w" event={"ID":"02803a9c-81aa-4d42-9ee5-59777f0b5228","Type":"ContainerStarted","Data":"bfb5f97057ac93c99c1135de0b6ea36e641046bb83ef8a6d91cee73f05a0587c"} Oct 13 14:05:27 crc kubenswrapper[4797]: I1013 14:05:27.878256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerStarted","Data":"e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f"} Oct 13 14:05:27 crc kubenswrapper[4797]: I1013 14:05:27.892296 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-qm22w" podStartSLOduration=2.420504013 podStartE2EDuration="4.892275569s" podCreationTimestamp="2025-10-13 14:05:23 +0000 UTC" firstStartedPulling="2025-10-13 14:05:24.844685069 +0000 UTC m=+3502.378235325" lastFinishedPulling="2025-10-13 14:05:27.316456625 +0000 UTC m=+3504.850006881" observedRunningTime="2025-10-13 14:05:27.891786457 +0000 UTC m=+3505.425336733" watchObservedRunningTime="2025-10-13 14:05:27.892275569 +0000 UTC m=+3505.425825825" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.397886 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.397936 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.444723 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.561009 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.561074 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.601663 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.887438 4797 generic.go:334] "Generic (PLEG): container finished" podID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerID="e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f" exitCode=0 Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.887573 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerDied","Data":"e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f"} Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.950527 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:28 crc kubenswrapper[4797]: I1013 14:05:28.952837 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:29 crc kubenswrapper[4797]: I1013 14:05:29.898145 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerStarted","Data":"a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94"} Oct 13 14:05:29 crc kubenswrapper[4797]: I1013 14:05:29.920977 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-57lrh" podStartSLOduration=2.28681195 podStartE2EDuration="5.920958386s" podCreationTimestamp="2025-10-13 14:05:24 +0000 UTC" firstStartedPulling="2025-10-13 14:05:25.857685545 +0000 UTC m=+3503.391235841" lastFinishedPulling="2025-10-13 14:05:29.491832011 +0000 UTC m=+3507.025382277" observedRunningTime="2025-10-13 14:05:29.915719068 +0000 UTC m=+3507.449269354" watchObservedRunningTime="2025-10-13 14:05:29.920958386 +0000 UTC m=+3507.454508642" Oct 13 14:05:33 crc kubenswrapper[4797]: I1013 14:05:33.608739 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:33 crc kubenswrapper[4797]: I1013 14:05:33.609097 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:33 crc kubenswrapper[4797]: I1013 14:05:33.650012 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:33 crc kubenswrapper[4797]: I1013 14:05:33.975411 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:34 crc kubenswrapper[4797]: I1013 14:05:34.763445 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:34 crc kubenswrapper[4797]: I1013 14:05:34.763793 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:34 crc kubenswrapper[4797]: I1013 14:05:34.800537 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:34 crc kubenswrapper[4797]: I1013 14:05:34.994590 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.252359 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d7lc"] Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.253050 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8d7lc" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="registry-server" containerID="cri-o://699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d" gracePeriod=2 Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.630756 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.740461 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-utilities\") pod \"e397abd8-1db4-4614-972e-b6b50f1623b5\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.740668 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-catalog-content\") pod \"e397abd8-1db4-4614-972e-b6b50f1623b5\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.740738 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t676c\" (UniqueName: \"kubernetes.io/projected/e397abd8-1db4-4614-972e-b6b50f1623b5-kube-api-access-t676c\") pod \"e397abd8-1db4-4614-972e-b6b50f1623b5\" (UID: \"e397abd8-1db4-4614-972e-b6b50f1623b5\") " Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.741631 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-utilities" (OuterVolumeSpecName: "utilities") pod "e397abd8-1db4-4614-972e-b6b50f1623b5" (UID: "e397abd8-1db4-4614-972e-b6b50f1623b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.746409 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e397abd8-1db4-4614-972e-b6b50f1623b5-kube-api-access-t676c" (OuterVolumeSpecName: "kube-api-access-t676c") pod "e397abd8-1db4-4614-972e-b6b50f1623b5" (UID: "e397abd8-1db4-4614-972e-b6b50f1623b5"). InnerVolumeSpecName "kube-api-access-t676c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.754133 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e397abd8-1db4-4614-972e-b6b50f1623b5" (UID: "e397abd8-1db4-4614-972e-b6b50f1623b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.839008 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxvnl"] Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.839287 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gxvnl" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="registry-server" containerID="cri-o://bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078" gracePeriod=2 Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.843129 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.843164 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t676c\" (UniqueName: \"kubernetes.io/projected/e397abd8-1db4-4614-972e-b6b50f1623b5-kube-api-access-t676c\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.843177 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e397abd8-1db4-4614-972e-b6b50f1623b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.946349 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerID="699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d" exitCode=0 Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.946427 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d7lc" Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.946429 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d7lc" event={"ID":"e397abd8-1db4-4614-972e-b6b50f1623b5","Type":"ContainerDied","Data":"699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d"} Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.946467 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d7lc" event={"ID":"e397abd8-1db4-4614-972e-b6b50f1623b5","Type":"ContainerDied","Data":"bd6614291e8ed42a4e773e7fbc87ca490c5e0757fefa7e3a337045a32a212a85"} Oct 13 14:05:35 crc kubenswrapper[4797]: I1013 14:05:35.946484 4797 scope.go:117] "RemoveContainer" containerID="699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.019043 4797 scope.go:117] "RemoveContainer" containerID="703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.023099 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d7lc"] Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.029549 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d7lc"] Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.061504 4797 scope.go:117] "RemoveContainer" containerID="7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.080962 4797 scope.go:117] "RemoveContainer" 
containerID="699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d" Oct 13 14:05:36 crc kubenswrapper[4797]: E1013 14:05:36.081383 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d\": container with ID starting with 699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d not found: ID does not exist" containerID="699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.081426 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d"} err="failed to get container status \"699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d\": rpc error: code = NotFound desc = could not find container \"699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d\": container with ID starting with 699c129f0ab5ff131a76ffd4f9db03b5d83b10645bfc02afedf46fa311a0887d not found: ID does not exist" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.081453 4797 scope.go:117] "RemoveContainer" containerID="703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f" Oct 13 14:05:36 crc kubenswrapper[4797]: E1013 14:05:36.083314 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f\": container with ID starting with 703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f not found: ID does not exist" containerID="703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.083359 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f"} err="failed to get container status \"703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f\": rpc error: code = NotFound desc = could not find container \"703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f\": container with ID starting with 703f70808be9d3d01abea6e518feb5bda81ddbe0759649850239fa0198fca01f not found: ID does not exist" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.083387 4797 scope.go:117] "RemoveContainer" containerID="7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98" Oct 13 14:05:36 crc kubenswrapper[4797]: E1013 14:05:36.083818 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98\": container with ID starting with 7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98 not found: ID does not exist" containerID="7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.083854 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98"} err="failed to get container status \"7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98\": rpc error: code = NotFound desc = could not find container \"7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98\": container with ID starting with 7c5dbbfc18d550c17d329281fd825c40f416a462fc73092ef507bbdaff43ff98 not found: ID does not exist" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.181123 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.349629 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh77w\" (UniqueName: \"kubernetes.io/projected/75f6f619-b978-49a6-b22a-55646ac866bb-kube-api-access-rh77w\") pod \"75f6f619-b978-49a6-b22a-55646ac866bb\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.349683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-catalog-content\") pod \"75f6f619-b978-49a6-b22a-55646ac866bb\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.349833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-utilities\") pod \"75f6f619-b978-49a6-b22a-55646ac866bb\" (UID: \"75f6f619-b978-49a6-b22a-55646ac866bb\") " Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.350647 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-utilities" (OuterVolumeSpecName: "utilities") pod "75f6f619-b978-49a6-b22a-55646ac866bb" (UID: "75f6f619-b978-49a6-b22a-55646ac866bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.353543 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f6f619-b978-49a6-b22a-55646ac866bb-kube-api-access-rh77w" (OuterVolumeSpecName: "kube-api-access-rh77w") pod "75f6f619-b978-49a6-b22a-55646ac866bb" (UID: "75f6f619-b978-49a6-b22a-55646ac866bb"). InnerVolumeSpecName "kube-api-access-rh77w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.438192 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75f6f619-b978-49a6-b22a-55646ac866bb" (UID: "75f6f619-b978-49a6-b22a-55646ac866bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.451185 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.451244 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh77w\" (UniqueName: \"kubernetes.io/projected/75f6f619-b978-49a6-b22a-55646ac866bb-kube-api-access-rh77w\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.451260 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f6f619-b978-49a6-b22a-55646ac866bb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.956720 4797 generic.go:334] "Generic (PLEG): container finished" podID="75f6f619-b978-49a6-b22a-55646ac866bb" containerID="bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078" exitCode=0 Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.956764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvnl" event={"ID":"75f6f619-b978-49a6-b22a-55646ac866bb","Type":"ContainerDied","Data":"bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078"} Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.956789 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-gxvnl" event={"ID":"75f6f619-b978-49a6-b22a-55646ac866bb","Type":"ContainerDied","Data":"da7bfeedc79bb9cbd9180f5d5c140405d0729e5f2f00fbbcdce7d1119c9c2ccc"} Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.956824 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvnl" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.956837 4797 scope.go:117] "RemoveContainer" containerID="bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.976468 4797 scope.go:117] "RemoveContainer" containerID="e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda" Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.995838 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxvnl"] Oct 13 14:05:36 crc kubenswrapper[4797]: I1013 14:05:36.999790 4797 scope.go:117] "RemoveContainer" containerID="9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.002187 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gxvnl"] Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.023165 4797 scope.go:117] "RemoveContainer" containerID="bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078" Oct 13 14:05:37 crc kubenswrapper[4797]: E1013 14:05:37.023662 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078\": container with ID starting with bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078 not found: ID does not exist" containerID="bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.023717 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078"} err="failed to get container status \"bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078\": rpc error: code = NotFound desc = could not find container \"bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078\": container with ID starting with bf7c2cf587e450d2017d86a9b877dfd8088d662dc489ffb543e8539faba78078 not found: ID does not exist" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.023739 4797 scope.go:117] "RemoveContainer" containerID="e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda" Oct 13 14:05:37 crc kubenswrapper[4797]: E1013 14:05:37.024067 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda\": container with ID starting with e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda not found: ID does not exist" containerID="e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.024099 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda"} err="failed to get container status \"e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda\": rpc error: code = NotFound desc = could not find container \"e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda\": container with ID starting with e0621b222577af13ae8808c7f8a74e4ae2fff7f9702c895b265ac0f27c5f3eda not found: ID does not exist" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.024122 4797 scope.go:117] "RemoveContainer" containerID="9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6" Oct 13 14:05:37 crc kubenswrapper[4797]: E1013 
14:05:37.024487 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6\": container with ID starting with 9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6 not found: ID does not exist" containerID="9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.024510 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6"} err="failed to get container status \"9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6\": rpc error: code = NotFound desc = could not find container \"9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6\": container with ID starting with 9ac0384d97066ac81ae707aff148a0b48c6eaf4b93fe799fc5fc0058ecd182b6 not found: ID does not exist" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.243519 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" path="/var/lib/kubelet/pods/75f6f619-b978-49a6-b22a-55646ac866bb/volumes" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.244134 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" path="/var/lib/kubelet/pods/e397abd8-1db4-4614-972e-b6b50f1623b5/volumes" Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.638936 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qm22w"] Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.639235 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qm22w" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="registry-server" 
containerID="cri-o://bfb5f97057ac93c99c1135de0b6ea36e641046bb83ef8a6d91cee73f05a0587c" gracePeriod=2 Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.967797 4797 generic.go:334] "Generic (PLEG): container finished" podID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerID="bfb5f97057ac93c99c1135de0b6ea36e641046bb83ef8a6d91cee73f05a0587c" exitCode=0 Oct 13 14:05:37 crc kubenswrapper[4797]: I1013 14:05:37.967844 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm22w" event={"ID":"02803a9c-81aa-4d42-9ee5-59777f0b5228","Type":"ContainerDied","Data":"bfb5f97057ac93c99c1135de0b6ea36e641046bb83ef8a6d91cee73f05a0587c"} Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.034369 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.174722 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-utilities\") pod \"02803a9c-81aa-4d42-9ee5-59777f0b5228\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.174849 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-catalog-content\") pod \"02803a9c-81aa-4d42-9ee5-59777f0b5228\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.174913 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqt6k\" (UniqueName: \"kubernetes.io/projected/02803a9c-81aa-4d42-9ee5-59777f0b5228-kube-api-access-qqt6k\") pod \"02803a9c-81aa-4d42-9ee5-59777f0b5228\" (UID: \"02803a9c-81aa-4d42-9ee5-59777f0b5228\") " Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 
14:05:38.175833 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-utilities" (OuterVolumeSpecName: "utilities") pod "02803a9c-81aa-4d42-9ee5-59777f0b5228" (UID: "02803a9c-81aa-4d42-9ee5-59777f0b5228"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.180450 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02803a9c-81aa-4d42-9ee5-59777f0b5228-kube-api-access-qqt6k" (OuterVolumeSpecName: "kube-api-access-qqt6k") pod "02803a9c-81aa-4d42-9ee5-59777f0b5228" (UID: "02803a9c-81aa-4d42-9ee5-59777f0b5228"). InnerVolumeSpecName "kube-api-access-qqt6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.219120 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02803a9c-81aa-4d42-9ee5-59777f0b5228" (UID: "02803a9c-81aa-4d42-9ee5-59777f0b5228"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.236579 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57lrh"] Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.236794 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-57lrh" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="registry-server" containerID="cri-o://a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94" gracePeriod=2 Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.276872 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.276929 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02803a9c-81aa-4d42-9ee5-59777f0b5228-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.276954 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqt6k\" (UniqueName: \"kubernetes.io/projected/02803a9c-81aa-4d42-9ee5-59777f0b5228-kube-api-access-qqt6k\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.616303 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.783480 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7lwz\" (UniqueName: \"kubernetes.io/projected/bdb6f76c-3a26-441b-9292-07e1194f4045-kube-api-access-x7lwz\") pod \"bdb6f76c-3a26-441b-9292-07e1194f4045\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.783553 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-catalog-content\") pod \"bdb6f76c-3a26-441b-9292-07e1194f4045\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.783575 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-utilities\") pod \"bdb6f76c-3a26-441b-9292-07e1194f4045\" (UID: \"bdb6f76c-3a26-441b-9292-07e1194f4045\") " Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.784792 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-utilities" (OuterVolumeSpecName: "utilities") pod "bdb6f76c-3a26-441b-9292-07e1194f4045" (UID: "bdb6f76c-3a26-441b-9292-07e1194f4045"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.793062 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb6f76c-3a26-441b-9292-07e1194f4045-kube-api-access-x7lwz" (OuterVolumeSpecName: "kube-api-access-x7lwz") pod "bdb6f76c-3a26-441b-9292-07e1194f4045" (UID: "bdb6f76c-3a26-441b-9292-07e1194f4045"). InnerVolumeSpecName "kube-api-access-x7lwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.834457 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdb6f76c-3a26-441b-9292-07e1194f4045" (UID: "bdb6f76c-3a26-441b-9292-07e1194f4045"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.885165 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7lwz\" (UniqueName: \"kubernetes.io/projected/bdb6f76c-3a26-441b-9292-07e1194f4045-kube-api-access-x7lwz\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.885431 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.885542 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb6f76c-3a26-441b-9292-07e1194f4045-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.978415 4797 generic.go:334] "Generic (PLEG): container finished" podID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerID="a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94" exitCode=0 Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.978502 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerDied","Data":"a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94"} Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.979269 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-57lrh" event={"ID":"bdb6f76c-3a26-441b-9292-07e1194f4045","Type":"ContainerDied","Data":"5e6bd2b00bd9b78e385ea08b88faea580f02b43919f1d22bb64c04497fe5101c"} Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.978540 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57lrh" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.979552 4797 scope.go:117] "RemoveContainer" containerID="a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94" Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.983850 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm22w" event={"ID":"02803a9c-81aa-4d42-9ee5-59777f0b5228","Type":"ContainerDied","Data":"f944d91a1415b6219ce0680a26bdf4e0bbaf9b8f80be08444cd7c5db4a58ba21"} Oct 13 14:05:38 crc kubenswrapper[4797]: I1013 14:05:38.983895 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qm22w" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.008691 4797 scope.go:117] "RemoveContainer" containerID="e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.038663 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57lrh"] Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.041996 4797 scope.go:117] "RemoveContainer" containerID="a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.047448 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-57lrh"] Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.055330 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qm22w"] Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.061587 4797 scope.go:117] "RemoveContainer" containerID="a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94" Oct 13 14:05:39 crc kubenswrapper[4797]: E1013 14:05:39.062045 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94\": container with ID starting with a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94 not found: ID does not exist" containerID="a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.062171 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94"} err="failed to get container status \"a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94\": rpc error: code = NotFound desc = could not find container 
\"a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94\": container with ID starting with a0842573814666a3f89245be5d027b01a8994c0f54cdf28926214ddf9c245d94 not found: ID does not exist" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.062303 4797 scope.go:117] "RemoveContainer" containerID="e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f" Oct 13 14:05:39 crc kubenswrapper[4797]: E1013 14:05:39.062653 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f\": container with ID starting with e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f not found: ID does not exist" containerID="e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.062687 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f"} err="failed to get container status \"e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f\": rpc error: code = NotFound desc = could not find container \"e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f\": container with ID starting with e2a77edbcfa9086ba755f948d2d81f2cf67642b2a15fb67fa3d72de84b36a15f not found: ID does not exist" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.062710 4797 scope.go:117] "RemoveContainer" containerID="a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb" Oct 13 14:05:39 crc kubenswrapper[4797]: E1013 14:05:39.063098 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb\": container with ID starting with a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb not found: ID does not exist" 
containerID="a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.063313 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb"} err="failed to get container status \"a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb\": rpc error: code = NotFound desc = could not find container \"a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb\": container with ID starting with a0b1ba3889738317508403c5a411939586c225e8bacb1bd5188eeebda7d5aafb not found: ID does not exist" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.063402 4797 scope.go:117] "RemoveContainer" containerID="bfb5f97057ac93c99c1135de0b6ea36e641046bb83ef8a6d91cee73f05a0587c" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.063639 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qm22w"] Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.089100 4797 scope.go:117] "RemoveContainer" containerID="24f2ad224b465645fb6309551d1b376fb966212f5e4c6acd9575c37663ad08de" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.104839 4797 scope.go:117] "RemoveContainer" containerID="d51ccfce7a8be36a91943627797e42e1f76e157d1ef6f08547b200fa53e97226" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.245559 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" path="/var/lib/kubelet/pods/02803a9c-81aa-4d42-9ee5-59777f0b5228/volumes" Oct 13 14:05:39 crc kubenswrapper[4797]: I1013 14:05:39.246301 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" path="/var/lib/kubelet/pods/bdb6f76c-3a26-441b-9292-07e1194f4045/volumes" Oct 13 14:06:18 crc kubenswrapper[4797]: I1013 14:06:18.120021 4797 patch_prober.go:28] interesting 
pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:06:18 crc kubenswrapper[4797]: I1013 14:06:18.120653 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:06:48 crc kubenswrapper[4797]: I1013 14:06:48.119656 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:06:48 crc kubenswrapper[4797]: I1013 14:06:48.120310 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.119982 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.121043 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.121102 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.121710 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.121782 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" gracePeriod=600 Oct 13 14:07:18 crc kubenswrapper[4797]: E1013 14:07:18.249624 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.742173 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" exitCode=0 Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 
14:07:18.742266 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a"} Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.742728 4797 scope.go:117] "RemoveContainer" containerID="39263090f3bbb8cc8391e1dcd103f422d045f83425792d9a12337578bb539d50" Oct 13 14:07:18 crc kubenswrapper[4797]: I1013 14:07:18.743259 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:07:18 crc kubenswrapper[4797]: E1013 14:07:18.743481 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:07:32 crc kubenswrapper[4797]: I1013 14:07:32.236757 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:07:32 crc kubenswrapper[4797]: E1013 14:07:32.237492 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:07:43 crc kubenswrapper[4797]: I1013 14:07:43.236245 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 
14:07:43 crc kubenswrapper[4797]: E1013 14:07:43.237174 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:07:56 crc kubenswrapper[4797]: I1013 14:07:56.236720 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:07:56 crc kubenswrapper[4797]: E1013 14:07:56.237540 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:08:07 crc kubenswrapper[4797]: I1013 14:08:07.236041 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:08:07 crc kubenswrapper[4797]: E1013 14:08:07.236771 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:08:22 crc kubenswrapper[4797]: I1013 14:08:22.237489 4797 scope.go:117] "RemoveContainer" 
containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:08:22 crc kubenswrapper[4797]: E1013 14:08:22.238736 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:08:34 crc kubenswrapper[4797]: I1013 14:08:34.236018 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:08:34 crc kubenswrapper[4797]: E1013 14:08:34.236705 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:08:47 crc kubenswrapper[4797]: I1013 14:08:47.235841 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:08:47 crc kubenswrapper[4797]: E1013 14:08:47.249907 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:09:02 crc kubenswrapper[4797]: I1013 14:09:02.236064 4797 scope.go:117] 
"RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:09:02 crc kubenswrapper[4797]: E1013 14:09:02.236635 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:09:16 crc kubenswrapper[4797]: I1013 14:09:16.235597 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:09:16 crc kubenswrapper[4797]: E1013 14:09:16.236225 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:09:28 crc kubenswrapper[4797]: I1013 14:09:28.236114 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:09:28 crc kubenswrapper[4797]: E1013 14:09:28.237059 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:09:41 crc kubenswrapper[4797]: I1013 14:09:41.235748 
4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:09:41 crc kubenswrapper[4797]: E1013 14:09:41.236483 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:09:54 crc kubenswrapper[4797]: I1013 14:09:54.236912 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:09:54 crc kubenswrapper[4797]: E1013 14:09:54.237858 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:10:09 crc kubenswrapper[4797]: I1013 14:10:09.235870 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:10:09 crc kubenswrapper[4797]: E1013 14:10:09.236442 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:10:21 crc kubenswrapper[4797]: I1013 
14:10:21.236180 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:10:21 crc kubenswrapper[4797]: E1013 14:10:21.236861 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:10:33 crc kubenswrapper[4797]: I1013 14:10:33.267425 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:10:33 crc kubenswrapper[4797]: E1013 14:10:33.268780 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:10:44 crc kubenswrapper[4797]: I1013 14:10:44.236728 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:10:44 crc kubenswrapper[4797]: E1013 14:10:44.237572 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:10:57 crc 
kubenswrapper[4797]: I1013 14:10:57.236922 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:10:57 crc kubenswrapper[4797]: E1013 14:10:57.238113 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:11:08 crc kubenswrapper[4797]: I1013 14:11:08.236407 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:11:08 crc kubenswrapper[4797]: E1013 14:11:08.237029 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:11:21 crc kubenswrapper[4797]: I1013 14:11:21.236214 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:11:21 crc kubenswrapper[4797]: E1013 14:11:21.237094 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 
13 14:11:33 crc kubenswrapper[4797]: I1013 14:11:33.242025 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:11:33 crc kubenswrapper[4797]: E1013 14:11:33.243193 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:11:44 crc kubenswrapper[4797]: I1013 14:11:44.236870 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:11:44 crc kubenswrapper[4797]: E1013 14:11:44.237667 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:11:59 crc kubenswrapper[4797]: I1013 14:11:59.236621 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:11:59 crc kubenswrapper[4797]: E1013 14:11:59.237977 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:12:11 crc kubenswrapper[4797]: I1013 14:12:11.236150 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:12:11 crc kubenswrapper[4797]: E1013 14:12:11.237122 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:12:26 crc kubenswrapper[4797]: I1013 14:12:26.236138 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:12:27 crc kubenswrapper[4797]: I1013 14:12:27.246877 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"acb629200fdd59edad77e0212a8498d3d82c7cdb09b876143e9a24e3014955a5"} Oct 13 14:14:48 crc kubenswrapper[4797]: I1013 14:14:48.119921 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:14:48 crc kubenswrapper[4797]: I1013 14:14:48.120529 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:14:50 
crc kubenswrapper[4797]: I1013 14:14:50.734066 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-7qr56"] Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.741009 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-7qr56"] Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.861525 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jpf9n"] Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.861946 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.861973 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.861999 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862011 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862033 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862045 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862057 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862067 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862082 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862095 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862112 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862122 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862137 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862147 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="extract-utilities" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862165 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862176 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862200 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862213 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="extract-content" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862238 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862248 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862262 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862271 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: E1013 14:14:50.862284 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862294 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862509 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f6f619-b978-49a6-b22a-55646ac866bb" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862539 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e397abd8-1db4-4614-972e-b6b50f1623b5" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862557 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="02803a9c-81aa-4d42-9ee5-59777f0b5228" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.862573 4797 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="bdb6f76c-3a26-441b-9292-07e1194f4045" containerName="registry-server" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.863297 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.868109 4797 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xnt4c" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.868439 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.868662 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.869015 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.871541 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jpf9n"] Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.972327 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7dwn\" (UniqueName: \"kubernetes.io/projected/1af41123-465c-45d0-b803-24733c3a894d-kube-api-access-t7dwn\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.972371 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1af41123-465c-45d0-b803-24733c3a894d-node-mnt\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:50 crc kubenswrapper[4797]: I1013 14:14:50.972589 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1af41123-465c-45d0-b803-24733c3a894d-crc-storage\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.074312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1af41123-465c-45d0-b803-24733c3a894d-crc-storage\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.074479 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7dwn\" (UniqueName: \"kubernetes.io/projected/1af41123-465c-45d0-b803-24733c3a894d-kube-api-access-t7dwn\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.074523 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1af41123-465c-45d0-b803-24733c3a894d-node-mnt\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.074955 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1af41123-465c-45d0-b803-24733c3a894d-node-mnt\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.075857 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/1af41123-465c-45d0-b803-24733c3a894d-crc-storage\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.107540 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7dwn\" (UniqueName: \"kubernetes.io/projected/1af41123-465c-45d0-b803-24733c3a894d-kube-api-access-t7dwn\") pod \"crc-storage-crc-jpf9n\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.185314 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.254613 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307671bd-dffc-403e-8ad7-034af76c10be" path="/var/lib/kubelet/pods/307671bd-dffc-403e-8ad7-034af76c10be/volumes" Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.437221 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.441430 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jpf9n"] Oct 13 14:14:51 crc kubenswrapper[4797]: I1013 14:14:51.478476 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jpf9n" event={"ID":"1af41123-465c-45d0-b803-24733c3a894d","Type":"ContainerStarted","Data":"83799ee0d5c59403b259241f280499a86f4819658485725d24ac1b3574620652"} Oct 13 14:14:52 crc kubenswrapper[4797]: I1013 14:14:52.490748 4797 generic.go:334] "Generic (PLEG): container finished" podID="1af41123-465c-45d0-b803-24733c3a894d" containerID="ce2f34a16021f09ae0550752bd4da285b25538d5effcb1520c8d1caed405ed0c" exitCode=0 Oct 13 14:14:52 crc kubenswrapper[4797]: I1013 14:14:52.490895 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="crc-storage/crc-storage-crc-jpf9n" event={"ID":"1af41123-465c-45d0-b803-24733c3a894d","Type":"ContainerDied","Data":"ce2f34a16021f09ae0550752bd4da285b25538d5effcb1520c8d1caed405ed0c"} Oct 13 14:14:53 crc kubenswrapper[4797]: I1013 14:14:53.819961 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.018777 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7dwn\" (UniqueName: \"kubernetes.io/projected/1af41123-465c-45d0-b803-24733c3a894d-kube-api-access-t7dwn\") pod \"1af41123-465c-45d0-b803-24733c3a894d\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.019220 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1af41123-465c-45d0-b803-24733c3a894d-node-mnt\") pod \"1af41123-465c-45d0-b803-24733c3a894d\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.019331 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1af41123-465c-45d0-b803-24733c3a894d-crc-storage\") pod \"1af41123-465c-45d0-b803-24733c3a894d\" (UID: \"1af41123-465c-45d0-b803-24733c3a894d\") " Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.019693 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1af41123-465c-45d0-b803-24733c3a894d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1af41123-465c-45d0-b803-24733c3a894d" (UID: "1af41123-465c-45d0-b803-24733c3a894d"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.028584 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af41123-465c-45d0-b803-24733c3a894d-kube-api-access-t7dwn" (OuterVolumeSpecName: "kube-api-access-t7dwn") pod "1af41123-465c-45d0-b803-24733c3a894d" (UID: "1af41123-465c-45d0-b803-24733c3a894d"). InnerVolumeSpecName "kube-api-access-t7dwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.045255 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af41123-465c-45d0-b803-24733c3a894d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1af41123-465c-45d0-b803-24733c3a894d" (UID: "1af41123-465c-45d0-b803-24733c3a894d"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.120888 4797 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1af41123-465c-45d0-b803-24733c3a894d-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.120935 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7dwn\" (UniqueName: \"kubernetes.io/projected/1af41123-465c-45d0-b803-24733c3a894d-kube-api-access-t7dwn\") on node \"crc\" DevicePath \"\"" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.120949 4797 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1af41123-465c-45d0-b803-24733c3a894d-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.521194 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jpf9n" 
event={"ID":"1af41123-465c-45d0-b803-24733c3a894d","Type":"ContainerDied","Data":"83799ee0d5c59403b259241f280499a86f4819658485725d24ac1b3574620652"} Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.521248 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83799ee0d5c59403b259241f280499a86f4819658485725d24ac1b3574620652" Oct 13 14:14:54 crc kubenswrapper[4797]: I1013 14:14:54.521267 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jpf9n" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.130521 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jpf9n"] Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.136093 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jpf9n"] Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.276769 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tnbsr"] Oct 13 14:14:56 crc kubenswrapper[4797]: E1013 14:14:56.277189 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af41123-465c-45d0-b803-24733c3a894d" containerName="storage" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.277208 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af41123-465c-45d0-b803-24733c3a894d" containerName="storage" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.277345 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af41123-465c-45d0-b803-24733c3a894d" containerName="storage" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.277850 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.279894 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.279960 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.280050 4797 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xnt4c" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.280633 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.283237 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tnbsr"] Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.349055 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/75cf37d5-f587-4823-9887-9358e2e693d4-crc-storage\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.349164 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2snbf\" (UniqueName: \"kubernetes.io/projected/75cf37d5-f587-4823-9887-9358e2e693d4-kube-api-access-2snbf\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.349196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/75cf37d5-f587-4823-9887-9358e2e693d4-node-mnt\") pod \"crc-storage-crc-tnbsr\" (UID: 
\"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.450882 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/75cf37d5-f587-4823-9887-9358e2e693d4-crc-storage\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.451382 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2snbf\" (UniqueName: \"kubernetes.io/projected/75cf37d5-f587-4823-9887-9358e2e693d4-kube-api-access-2snbf\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.451428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/75cf37d5-f587-4823-9887-9358e2e693d4-node-mnt\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.451695 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/75cf37d5-f587-4823-9887-9358e2e693d4-crc-storage\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.451762 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/75cf37d5-f587-4823-9887-9358e2e693d4-node-mnt\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.470624 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2snbf\" (UniqueName: \"kubernetes.io/projected/75cf37d5-f587-4823-9887-9358e2e693d4-kube-api-access-2snbf\") pod \"crc-storage-crc-tnbsr\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:56 crc kubenswrapper[4797]: I1013 14:14:56.601703 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:14:57 crc kubenswrapper[4797]: I1013 14:14:57.053481 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tnbsr"] Oct 13 14:14:57 crc kubenswrapper[4797]: I1013 14:14:57.248207 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af41123-465c-45d0-b803-24733c3a894d" path="/var/lib/kubelet/pods/1af41123-465c-45d0-b803-24733c3a894d/volumes" Oct 13 14:14:57 crc kubenswrapper[4797]: I1013 14:14:57.553461 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tnbsr" event={"ID":"75cf37d5-f587-4823-9887-9358e2e693d4","Type":"ContainerStarted","Data":"687ac0a507162f6bf53a9f758d26f1842f288915b8ca1ccfa404721c16dfc006"} Oct 13 14:14:58 crc kubenswrapper[4797]: I1013 14:14:58.567730 4797 generic.go:334] "Generic (PLEG): container finished" podID="75cf37d5-f587-4823-9887-9358e2e693d4" containerID="16be03308008d45b9a7c8acb9362adac18592653280c2bcbaa2a4038fe82df3a" exitCode=0 Oct 13 14:14:58 crc kubenswrapper[4797]: I1013 14:14:58.567851 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tnbsr" event={"ID":"75cf37d5-f587-4823-9887-9358e2e693d4","Type":"ContainerDied","Data":"16be03308008d45b9a7c8acb9362adac18592653280c2bcbaa2a4038fe82df3a"} Oct 13 14:14:59 crc kubenswrapper[4797]: I1013 14:14:59.947545 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.110991 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/75cf37d5-f587-4823-9887-9358e2e693d4-node-mnt\") pod \"75cf37d5-f587-4823-9887-9358e2e693d4\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.111069 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2snbf\" (UniqueName: \"kubernetes.io/projected/75cf37d5-f587-4823-9887-9358e2e693d4-kube-api-access-2snbf\") pod \"75cf37d5-f587-4823-9887-9358e2e693d4\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.111076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75cf37d5-f587-4823-9887-9358e2e693d4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "75cf37d5-f587-4823-9887-9358e2e693d4" (UID: "75cf37d5-f587-4823-9887-9358e2e693d4"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.111171 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/75cf37d5-f587-4823-9887-9358e2e693d4-crc-storage\") pod \"75cf37d5-f587-4823-9887-9358e2e693d4\" (UID: \"75cf37d5-f587-4823-9887-9358e2e693d4\") " Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.111432 4797 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/75cf37d5-f587-4823-9887-9358e2e693d4-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.122385 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cf37d5-f587-4823-9887-9358e2e693d4-kube-api-access-2snbf" (OuterVolumeSpecName: "kube-api-access-2snbf") pod "75cf37d5-f587-4823-9887-9358e2e693d4" (UID: "75cf37d5-f587-4823-9887-9358e2e693d4"). InnerVolumeSpecName "kube-api-access-2snbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.142682 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75cf37d5-f587-4823-9887-9358e2e693d4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "75cf37d5-f587-4823-9887-9358e2e693d4" (UID: "75cf37d5-f587-4823-9887-9358e2e693d4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.149471 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq"] Oct 13 14:15:00 crc kubenswrapper[4797]: E1013 14:15:00.150123 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cf37d5-f587-4823-9887-9358e2e693d4" containerName="storage" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.150183 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cf37d5-f587-4823-9887-9358e2e693d4" containerName="storage" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.150423 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cf37d5-f587-4823-9887-9358e2e693d4" containerName="storage" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.151344 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.153861 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.155136 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq"] Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.155189 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.212612 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2snbf\" (UniqueName: \"kubernetes.io/projected/75cf37d5-f587-4823-9887-9358e2e693d4-kube-api-access-2snbf\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.212658 4797 
reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/75cf37d5-f587-4823-9887-9358e2e693d4-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.313533 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85aee1f1-31f9-4f8e-ad76-802c222bbece-config-volume\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.313626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85aee1f1-31f9-4f8e-ad76-802c222bbece-secret-volume\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.314231 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppp85\" (UniqueName: \"kubernetes.io/projected/85aee1f1-31f9-4f8e-ad76-802c222bbece-kube-api-access-ppp85\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.415821 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppp85\" (UniqueName: \"kubernetes.io/projected/85aee1f1-31f9-4f8e-ad76-802c222bbece-kube-api-access-ppp85\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc 
kubenswrapper[4797]: I1013 14:15:00.415940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85aee1f1-31f9-4f8e-ad76-802c222bbece-config-volume\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.415964 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85aee1f1-31f9-4f8e-ad76-802c222bbece-secret-volume\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.417353 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85aee1f1-31f9-4f8e-ad76-802c222bbece-config-volume\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.419501 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85aee1f1-31f9-4f8e-ad76-802c222bbece-secret-volume\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.442609 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppp85\" (UniqueName: \"kubernetes.io/projected/85aee1f1-31f9-4f8e-ad76-802c222bbece-kube-api-access-ppp85\") pod \"collect-profiles-29339415-4wslq\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.494014 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.595491 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tnbsr" event={"ID":"75cf37d5-f587-4823-9887-9358e2e693d4","Type":"ContainerDied","Data":"687ac0a507162f6bf53a9f758d26f1842f288915b8ca1ccfa404721c16dfc006"} Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.595996 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687ac0a507162f6bf53a9f758d26f1842f288915b8ca1ccfa404721c16dfc006" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.595550 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tnbsr" Oct 13 14:15:00 crc kubenswrapper[4797]: I1013 14:15:00.921937 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq"] Oct 13 14:15:00 crc kubenswrapper[4797]: W1013 14:15:00.926856 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85aee1f1_31f9_4f8e_ad76_802c222bbece.slice/crio-f73aff6cc1a5ca83a1ceeabe7bdfac97c455f7b6008c011002fc3a49c4fc0a90 WatchSource:0}: Error finding container f73aff6cc1a5ca83a1ceeabe7bdfac97c455f7b6008c011002fc3a49c4fc0a90: Status 404 returned error can't find the container with id f73aff6cc1a5ca83a1ceeabe7bdfac97c455f7b6008c011002fc3a49c4fc0a90 Oct 13 14:15:01 crc kubenswrapper[4797]: I1013 14:15:01.604630 4797 generic.go:334] "Generic (PLEG): container finished" podID="85aee1f1-31f9-4f8e-ad76-802c222bbece" containerID="681eee713f8e63de016f4eb9fc053d1417d329ce5d6284aee304a3a642fb7342" exitCode=0 Oct 13 
14:15:01 crc kubenswrapper[4797]: I1013 14:15:01.604737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" event={"ID":"85aee1f1-31f9-4f8e-ad76-802c222bbece","Type":"ContainerDied","Data":"681eee713f8e63de016f4eb9fc053d1417d329ce5d6284aee304a3a642fb7342"} Oct 13 14:15:01 crc kubenswrapper[4797]: I1013 14:15:01.606007 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" event={"ID":"85aee1f1-31f9-4f8e-ad76-802c222bbece","Type":"ContainerStarted","Data":"f73aff6cc1a5ca83a1ceeabe7bdfac97c455f7b6008c011002fc3a49c4fc0a90"} Oct 13 14:15:02 crc kubenswrapper[4797]: I1013 14:15:02.884193 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.053058 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85aee1f1-31f9-4f8e-ad76-802c222bbece-config-volume\") pod \"85aee1f1-31f9-4f8e-ad76-802c222bbece\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.053172 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppp85\" (UniqueName: \"kubernetes.io/projected/85aee1f1-31f9-4f8e-ad76-802c222bbece-kube-api-access-ppp85\") pod \"85aee1f1-31f9-4f8e-ad76-802c222bbece\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.053255 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85aee1f1-31f9-4f8e-ad76-802c222bbece-secret-volume\") pod \"85aee1f1-31f9-4f8e-ad76-802c222bbece\" (UID: \"85aee1f1-31f9-4f8e-ad76-802c222bbece\") " Oct 13 14:15:03 crc 
kubenswrapper[4797]: I1013 14:15:03.053858 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85aee1f1-31f9-4f8e-ad76-802c222bbece-config-volume" (OuterVolumeSpecName: "config-volume") pod "85aee1f1-31f9-4f8e-ad76-802c222bbece" (UID: "85aee1f1-31f9-4f8e-ad76-802c222bbece"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.075465 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85aee1f1-31f9-4f8e-ad76-802c222bbece-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85aee1f1-31f9-4f8e-ad76-802c222bbece" (UID: "85aee1f1-31f9-4f8e-ad76-802c222bbece"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.076032 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85aee1f1-31f9-4f8e-ad76-802c222bbece-kube-api-access-ppp85" (OuterVolumeSpecName: "kube-api-access-ppp85") pod "85aee1f1-31f9-4f8e-ad76-802c222bbece" (UID: "85aee1f1-31f9-4f8e-ad76-802c222bbece"). InnerVolumeSpecName "kube-api-access-ppp85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.154937 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85aee1f1-31f9-4f8e-ad76-802c222bbece-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.154984 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppp85\" (UniqueName: \"kubernetes.io/projected/85aee1f1-31f9-4f8e-ad76-802c222bbece-kube-api-access-ppp85\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.154997 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85aee1f1-31f9-4f8e-ad76-802c222bbece-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.624412 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" event={"ID":"85aee1f1-31f9-4f8e-ad76-802c222bbece","Type":"ContainerDied","Data":"f73aff6cc1a5ca83a1ceeabe7bdfac97c455f7b6008c011002fc3a49c4fc0a90"} Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.624460 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f73aff6cc1a5ca83a1ceeabe7bdfac97c455f7b6008c011002fc3a49c4fc0a90" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.624537 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq" Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.946689 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc"] Oct 13 14:15:03 crc kubenswrapper[4797]: I1013 14:15:03.949732 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339370-mslvc"] Oct 13 14:15:05 crc kubenswrapper[4797]: I1013 14:15:05.255197 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad219041-8188-4fbc-9fc6-dac8a4b904c3" path="/var/lib/kubelet/pods/ad219041-8188-4fbc-9fc6-dac8a4b904c3/volumes" Oct 13 14:15:11 crc kubenswrapper[4797]: I1013 14:15:11.488422 4797 scope.go:117] "RemoveContainer" containerID="a6a5f98a978fccc927f2b842bcde0364f07c12b99c741765c5ab8cb736a25f02" Oct 13 14:15:11 crc kubenswrapper[4797]: I1013 14:15:11.506745 4797 scope.go:117] "RemoveContainer" containerID="4cbef5a09a8ea7a1158c859166d8ca251477ff59b88f204ba42490bff7c6d5cc" Oct 13 14:15:18 crc kubenswrapper[4797]: I1013 14:15:18.120502 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:15:18 crc kubenswrapper[4797]: I1013 14:15:18.121146 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.222410 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-tpv5p"] Oct 13 14:15:41 crc kubenswrapper[4797]: E1013 14:15:41.223517 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85aee1f1-31f9-4f8e-ad76-802c222bbece" containerName="collect-profiles" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.223537 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="85aee1f1-31f9-4f8e-ad76-802c222bbece" containerName="collect-profiles" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.223783 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="85aee1f1-31f9-4f8e-ad76-802c222bbece" containerName="collect-profiles" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.225868 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.252772 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-utilities\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.253091 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-catalog-content\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.253258 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgsx\" (UniqueName: \"kubernetes.io/projected/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-kube-api-access-7wgsx\") pod \"redhat-operators-tpv5p\" (UID: 
\"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.256667 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpv5p"] Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.354446 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgsx\" (UniqueName: \"kubernetes.io/projected/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-kube-api-access-7wgsx\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.354865 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-utilities\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.354923 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-catalog-content\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.355348 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-catalog-content\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.355870 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-utilities\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.374460 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgsx\" (UniqueName: \"kubernetes.io/projected/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-kube-api-access-7wgsx\") pod \"redhat-operators-tpv5p\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.569013 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.800331 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpv5p"] Oct 13 14:15:41 crc kubenswrapper[4797]: I1013 14:15:41.933866 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpv5p" event={"ID":"a30fc4ba-e870-4ea5-8e0f-d3b38c054167","Type":"ContainerStarted","Data":"47a9c3ee5da565d26043202f6b104b73695938c99f15f75d08f6b3345dadd7a1"} Oct 13 14:15:42 crc kubenswrapper[4797]: I1013 14:15:42.944091 4797 generic.go:334] "Generic (PLEG): container finished" podID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerID="d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e" exitCode=0 Oct 13 14:15:42 crc kubenswrapper[4797]: I1013 14:15:42.944152 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpv5p" event={"ID":"a30fc4ba-e870-4ea5-8e0f-d3b38c054167","Type":"ContainerDied","Data":"d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e"} Oct 13 14:15:44 crc kubenswrapper[4797]: I1013 14:15:44.972841 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerID="ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d" exitCode=0 Oct 13 14:15:44 crc kubenswrapper[4797]: I1013 14:15:44.972940 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpv5p" event={"ID":"a30fc4ba-e870-4ea5-8e0f-d3b38c054167","Type":"ContainerDied","Data":"ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d"} Oct 13 14:15:45 crc kubenswrapper[4797]: I1013 14:15:45.985868 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpv5p" event={"ID":"a30fc4ba-e870-4ea5-8e0f-d3b38c054167","Type":"ContainerStarted","Data":"0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169"} Oct 13 14:15:46 crc kubenswrapper[4797]: I1013 14:15:46.017442 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpv5p" podStartSLOduration=2.5139509650000003 podStartE2EDuration="5.017414282s" podCreationTimestamp="2025-10-13 14:15:41 +0000 UTC" firstStartedPulling="2025-10-13 14:15:42.947937567 +0000 UTC m=+4120.481487823" lastFinishedPulling="2025-10-13 14:15:45.451400874 +0000 UTC m=+4122.984951140" observedRunningTime="2025-10-13 14:15:46.012561043 +0000 UTC m=+4123.546111299" watchObservedRunningTime="2025-10-13 14:15:46.017414282 +0000 UTC m=+4123.550964568" Oct 13 14:15:48 crc kubenswrapper[4797]: I1013 14:15:48.120438 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:15:48 crc kubenswrapper[4797]: I1013 14:15:48.120490 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:15:48 crc kubenswrapper[4797]: I1013 14:15:48.120528 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:15:48 crc kubenswrapper[4797]: I1013 14:15:48.121189 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acb629200fdd59edad77e0212a8498d3d82c7cdb09b876143e9a24e3014955a5"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:15:48 crc kubenswrapper[4797]: I1013 14:15:48.121238 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://acb629200fdd59edad77e0212a8498d3d82c7cdb09b876143e9a24e3014955a5" gracePeriod=600 Oct 13 14:15:49 crc kubenswrapper[4797]: I1013 14:15:49.010925 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="acb629200fdd59edad77e0212a8498d3d82c7cdb09b876143e9a24e3014955a5" exitCode=0 Oct 13 14:15:49 crc kubenswrapper[4797]: I1013 14:15:49.011018 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"acb629200fdd59edad77e0212a8498d3d82c7cdb09b876143e9a24e3014955a5"} Oct 13 14:15:49 crc kubenswrapper[4797]: I1013 14:15:49.011597 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2"} Oct 13 14:15:49 crc kubenswrapper[4797]: I1013 14:15:49.011626 4797 scope.go:117] "RemoveContainer" containerID="cb608ef00552a320249af626728ae168e49b74a818c0214980d05a716b11d54a" Oct 13 14:15:51 crc kubenswrapper[4797]: I1013 14:15:51.569882 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:51 crc kubenswrapper[4797]: I1013 14:15:51.570586 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:51 crc kubenswrapper[4797]: I1013 14:15:51.646310 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:52 crc kubenswrapper[4797]: I1013 14:15:52.099129 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:52 crc kubenswrapper[4797]: I1013 14:15:52.149619 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpv5p"] Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.054335 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tpv5p" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="registry-server" containerID="cri-o://0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169" gracePeriod=2 Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.445089 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.548444 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-catalog-content\") pod \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.548527 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-utilities\") pod \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.548631 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgsx\" (UniqueName: \"kubernetes.io/projected/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-kube-api-access-7wgsx\") pod \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\" (UID: \"a30fc4ba-e870-4ea5-8e0f-d3b38c054167\") " Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.549253 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-utilities" (OuterVolumeSpecName: "utilities") pod "a30fc4ba-e870-4ea5-8e0f-d3b38c054167" (UID: "a30fc4ba-e870-4ea5-8e0f-d3b38c054167"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.553542 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-kube-api-access-7wgsx" (OuterVolumeSpecName: "kube-api-access-7wgsx") pod "a30fc4ba-e870-4ea5-8e0f-d3b38c054167" (UID: "a30fc4ba-e870-4ea5-8e0f-d3b38c054167"). InnerVolumeSpecName "kube-api-access-7wgsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.653838 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.653890 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgsx\" (UniqueName: \"kubernetes.io/projected/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-kube-api-access-7wgsx\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.659000 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a30fc4ba-e870-4ea5-8e0f-d3b38c054167" (UID: "a30fc4ba-e870-4ea5-8e0f-d3b38c054167"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:15:54 crc kubenswrapper[4797]: I1013 14:15:54.755063 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30fc4ba-e870-4ea5-8e0f-d3b38c054167-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.062173 4797 generic.go:334] "Generic (PLEG): container finished" podID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerID="0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169" exitCode=0 Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.062259 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpv5p" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.062258 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpv5p" event={"ID":"a30fc4ba-e870-4ea5-8e0f-d3b38c054167","Type":"ContainerDied","Data":"0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169"} Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.062737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpv5p" event={"ID":"a30fc4ba-e870-4ea5-8e0f-d3b38c054167","Type":"ContainerDied","Data":"47a9c3ee5da565d26043202f6b104b73695938c99f15f75d08f6b3345dadd7a1"} Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.062798 4797 scope.go:117] "RemoveContainer" containerID="0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.084137 4797 scope.go:117] "RemoveContainer" containerID="ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.093770 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpv5p"] Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.098580 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpv5p"] Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.122328 4797 scope.go:117] "RemoveContainer" containerID="d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.146165 4797 scope.go:117] "RemoveContainer" containerID="0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169" Oct 13 14:15:55 crc kubenswrapper[4797]: E1013 14:15:55.146671 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169\": container with ID starting with 0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169 not found: ID does not exist" containerID="0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.146708 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169"} err="failed to get container status \"0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169\": rpc error: code = NotFound desc = could not find container \"0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169\": container with ID starting with 0b2e6c35a8f2afeb60fc3dba395ca7e307ebfab3bfec2d0fd7d5a38f9d30f169 not found: ID does not exist" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.146754 4797 scope.go:117] "RemoveContainer" containerID="ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d" Oct 13 14:15:55 crc kubenswrapper[4797]: E1013 14:15:55.146988 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d\": container with ID starting with ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d not found: ID does not exist" containerID="ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.147013 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d"} err="failed to get container status \"ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d\": rpc error: code = NotFound desc = could not find container \"ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d\": container with ID 
starting with ab0a1f2fa95e70186efd552a4e67fb5ff1880c6d3cdde4f78a2b80761660925d not found: ID does not exist" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.147031 4797 scope.go:117] "RemoveContainer" containerID="d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e" Oct 13 14:15:55 crc kubenswrapper[4797]: E1013 14:15:55.147262 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e\": container with ID starting with d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e not found: ID does not exist" containerID="d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.147306 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e"} err="failed to get container status \"d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e\": rpc error: code = NotFound desc = could not find container \"d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e\": container with ID starting with d44d26cd290df354ac35b521eb1461fd679db729b839fa8846a9e48a1315799e not found: ID does not exist" Oct 13 14:15:55 crc kubenswrapper[4797]: I1013 14:15:55.244722 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" path="/var/lib/kubelet/pods/a30fc4ba-e870-4ea5-8e0f-d3b38c054167/volumes" Oct 13 14:16:11 crc kubenswrapper[4797]: I1013 14:16:11.975921 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-prg5z"] Oct 13 14:16:11 crc kubenswrapper[4797]: E1013 14:16:11.977023 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="extract-content" Oct 13 14:16:11 crc 
kubenswrapper[4797]: I1013 14:16:11.977044 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="extract-content" Oct 13 14:16:11 crc kubenswrapper[4797]: E1013 14:16:11.977068 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="registry-server" Oct 13 14:16:11 crc kubenswrapper[4797]: I1013 14:16:11.977077 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="registry-server" Oct 13 14:16:11 crc kubenswrapper[4797]: E1013 14:16:11.977088 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="extract-utilities" Oct 13 14:16:11 crc kubenswrapper[4797]: I1013 14:16:11.977096 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="extract-utilities" Oct 13 14:16:11 crc kubenswrapper[4797]: I1013 14:16:11.977273 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30fc4ba-e870-4ea5-8e0f-d3b38c054167" containerName="registry-server" Oct 13 14:16:11 crc kubenswrapper[4797]: I1013 14:16:11.978659 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:11 crc kubenswrapper[4797]: I1013 14:16:11.981671 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prg5z"] Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.100114 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-catalog-content\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.100261 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l429q\" (UniqueName: \"kubernetes.io/projected/cb251147-04cd-4f50-a65b-2fca9f95b4df-kube-api-access-l429q\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.100384 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-utilities\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.202356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-utilities\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.202485 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-catalog-content\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.202595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l429q\" (UniqueName: \"kubernetes.io/projected/cb251147-04cd-4f50-a65b-2fca9f95b4df-kube-api-access-l429q\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.204748 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-utilities\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.205147 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-catalog-content\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.227375 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l429q\" (UniqueName: \"kubernetes.io/projected/cb251147-04cd-4f50-a65b-2fca9f95b4df-kube-api-access-l429q\") pod \"redhat-marketplace-prg5z\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.304583 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:12 crc kubenswrapper[4797]: I1013 14:16:12.786621 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prg5z"] Oct 13 14:16:13 crc kubenswrapper[4797]: I1013 14:16:13.201611 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerID="482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044" exitCode=0 Oct 13 14:16:13 crc kubenswrapper[4797]: I1013 14:16:13.202162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prg5z" event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerDied","Data":"482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044"} Oct 13 14:16:13 crc kubenswrapper[4797]: I1013 14:16:13.202234 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prg5z" event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerStarted","Data":"2ac343c583a6e1ee01cb246fa4add87c8aa167726d93e0ba57a5b439740d5629"} Oct 13 14:16:13 crc kubenswrapper[4797]: E1013 14:16:13.215743 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb251147_04cd_4f50_a65b_2fca9f95b4df.slice/crio-482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb251147_04cd_4f50_a65b_2fca9f95b4df.slice/crio-conmon-482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044.scope\": RecentStats: unable to find data in memory cache]" Oct 13 14:16:14 crc kubenswrapper[4797]: I1013 14:16:14.214490 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prg5z" 
event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerStarted","Data":"c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c"} Oct 13 14:16:15 crc kubenswrapper[4797]: I1013 14:16:15.225280 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerID="c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c" exitCode=0 Oct 13 14:16:15 crc kubenswrapper[4797]: I1013 14:16:15.225352 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prg5z" event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerDied","Data":"c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c"} Oct 13 14:16:16 crc kubenswrapper[4797]: I1013 14:16:16.237517 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prg5z" event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerStarted","Data":"5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27"} Oct 13 14:16:16 crc kubenswrapper[4797]: I1013 14:16:16.253624 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-prg5z" podStartSLOduration=2.586503206 podStartE2EDuration="5.253604998s" podCreationTimestamp="2025-10-13 14:16:11 +0000 UTC" firstStartedPulling="2025-10-13 14:16:13.205539266 +0000 UTC m=+4150.739089522" lastFinishedPulling="2025-10-13 14:16:15.872641058 +0000 UTC m=+4153.406191314" observedRunningTime="2025-10-13 14:16:16.253109946 +0000 UTC m=+4153.786660282" watchObservedRunningTime="2025-10-13 14:16:16.253604998 +0000 UTC m=+4153.787155264" Oct 13 14:16:22 crc kubenswrapper[4797]: I1013 14:16:22.305639 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:22 crc kubenswrapper[4797]: I1013 14:16:22.306144 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:22 crc kubenswrapper[4797]: I1013 14:16:22.385616 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:23 crc kubenswrapper[4797]: I1013 14:16:23.370596 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:23 crc kubenswrapper[4797]: I1013 14:16:23.412160 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prg5z"] Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.310398 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-prg5z" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="registry-server" containerID="cri-o://5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27" gracePeriod=2 Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.710944 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.907010 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l429q\" (UniqueName: \"kubernetes.io/projected/cb251147-04cd-4f50-a65b-2fca9f95b4df-kube-api-access-l429q\") pod \"cb251147-04cd-4f50-a65b-2fca9f95b4df\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.910137 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-catalog-content\") pod \"cb251147-04cd-4f50-a65b-2fca9f95b4df\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.910254 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-utilities\") pod \"cb251147-04cd-4f50-a65b-2fca9f95b4df\" (UID: \"cb251147-04cd-4f50-a65b-2fca9f95b4df\") " Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.913224 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-utilities" (OuterVolumeSpecName: "utilities") pod "cb251147-04cd-4f50-a65b-2fca9f95b4df" (UID: "cb251147-04cd-4f50-a65b-2fca9f95b4df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.925065 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb251147-04cd-4f50-a65b-2fca9f95b4df-kube-api-access-l429q" (OuterVolumeSpecName: "kube-api-access-l429q") pod "cb251147-04cd-4f50-a65b-2fca9f95b4df" (UID: "cb251147-04cd-4f50-a65b-2fca9f95b4df"). InnerVolumeSpecName "kube-api-access-l429q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:16:25 crc kubenswrapper[4797]: I1013 14:16:25.929553 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb251147-04cd-4f50-a65b-2fca9f95b4df" (UID: "cb251147-04cd-4f50-a65b-2fca9f95b4df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.012584 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l429q\" (UniqueName: \"kubernetes.io/projected/cb251147-04cd-4f50-a65b-2fca9f95b4df-kube-api-access-l429q\") on node \"crc\" DevicePath \"\"" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.012613 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.012622 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb251147-04cd-4f50-a65b-2fca9f95b4df-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.320282 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerID="5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27" exitCode=0 Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.320344 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prg5z" event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerDied","Data":"5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27"} Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.320388 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-prg5z" event={"ID":"cb251147-04cd-4f50-a65b-2fca9f95b4df","Type":"ContainerDied","Data":"2ac343c583a6e1ee01cb246fa4add87c8aa167726d93e0ba57a5b439740d5629"} Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.320413 4797 scope.go:117] "RemoveContainer" containerID="5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.320422 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prg5z" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.343417 4797 scope.go:117] "RemoveContainer" containerID="c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.363853 4797 scope.go:117] "RemoveContainer" containerID="482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.415256 4797 scope.go:117] "RemoveContainer" containerID="5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27" Oct 13 14:16:26 crc kubenswrapper[4797]: E1013 14:16:26.418393 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27\": container with ID starting with 5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27 not found: ID does not exist" containerID="5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.418477 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27"} err="failed to get container status \"5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27\": rpc error: code = NotFound desc = could not find container 
\"5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27\": container with ID starting with 5eae637a053f0680ef2b5506e895e42ad2ebb673f741f37a15b34390bbf8ff27 not found: ID does not exist" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.418516 4797 scope.go:117] "RemoveContainer" containerID="c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c" Oct 13 14:16:26 crc kubenswrapper[4797]: E1013 14:16:26.419525 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c\": container with ID starting with c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c not found: ID does not exist" containerID="c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.419600 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c"} err="failed to get container status \"c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c\": rpc error: code = NotFound desc = could not find container \"c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c\": container with ID starting with c9d43fa475e2379cc64700b927eb2b38ada004960153e0af08ba998329fad46c not found: ID does not exist" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.419640 4797 scope.go:117] "RemoveContainer" containerID="482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044" Oct 13 14:16:26 crc kubenswrapper[4797]: E1013 14:16:26.420251 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044\": container with ID starting with 482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044 not found: ID does not exist" 
containerID="482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.420301 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044"} err="failed to get container status \"482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044\": rpc error: code = NotFound desc = could not find container \"482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044\": container with ID starting with 482e5c377934aec1619b55806565b0b63723549db5774ece04c93e6bbb8ff044 not found: ID does not exist" Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.424931 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prg5z"] Oct 13 14:16:26 crc kubenswrapper[4797]: I1013 14:16:26.433296 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-prg5z"] Oct 13 14:16:27 crc kubenswrapper[4797]: I1013 14:16:27.247619 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" path="/var/lib/kubelet/pods/cb251147-04cd-4f50-a65b-2fca9f95b4df/volumes" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.024144 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77769"] Oct 13 14:16:46 crc kubenswrapper[4797]: E1013 14:16:46.025192 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="extract-utilities" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.025214 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="extract-utilities" Oct 13 14:16:46 crc kubenswrapper[4797]: E1013 14:16:46.025256 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="extract-content" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.025271 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="extract-content" Oct 13 14:16:46 crc kubenswrapper[4797]: E1013 14:16:46.025314 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="registry-server" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.025328 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="registry-server" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.025630 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb251147-04cd-4f50-a65b-2fca9f95b4df" containerName="registry-server" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.027533 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.031753 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77769"] Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.129011 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-utilities\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.129092 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-catalog-content\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") 
" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.129156 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2tg\" (UniqueName: \"kubernetes.io/projected/d9197d39-0195-4ce4-b59f-43598f6a2623-kube-api-access-8q2tg\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.230670 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-utilities\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.230737 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-catalog-content\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.230774 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2tg\" (UniqueName: \"kubernetes.io/projected/d9197d39-0195-4ce4-b59f-43598f6a2623-kube-api-access-8q2tg\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.231341 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-catalog-content\") pod \"certified-operators-77769\" (UID: 
\"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.231436 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-utilities\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.252998 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2tg\" (UniqueName: \"kubernetes.io/projected/d9197d39-0195-4ce4-b59f-43598f6a2623-kube-api-access-8q2tg\") pod \"certified-operators-77769\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.357081 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:46 crc kubenswrapper[4797]: I1013 14:16:46.891939 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77769"] Oct 13 14:16:47 crc kubenswrapper[4797]: I1013 14:16:47.510665 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerID="b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8" exitCode=0 Oct 13 14:16:47 crc kubenswrapper[4797]: I1013 14:16:47.510783 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77769" event={"ID":"d9197d39-0195-4ce4-b59f-43598f6a2623","Type":"ContainerDied","Data":"b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8"} Oct 13 14:16:47 crc kubenswrapper[4797]: I1013 14:16:47.511251 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77769" 
event={"ID":"d9197d39-0195-4ce4-b59f-43598f6a2623","Type":"ContainerStarted","Data":"785987f1cdc229672ff292c94f87d44b988337a5899544657e1aa46a0fde4b0a"} Oct 13 14:16:49 crc kubenswrapper[4797]: I1013 14:16:49.539307 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerID="74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5" exitCode=0 Oct 13 14:16:49 crc kubenswrapper[4797]: I1013 14:16:49.539403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77769" event={"ID":"d9197d39-0195-4ce4-b59f-43598f6a2623","Type":"ContainerDied","Data":"74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5"} Oct 13 14:16:50 crc kubenswrapper[4797]: I1013 14:16:50.549871 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77769" event={"ID":"d9197d39-0195-4ce4-b59f-43598f6a2623","Type":"ContainerStarted","Data":"3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab"} Oct 13 14:16:50 crc kubenswrapper[4797]: I1013 14:16:50.571401 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77769" podStartSLOduration=2.7764880549999997 podStartE2EDuration="5.571384669s" podCreationTimestamp="2025-10-13 14:16:45 +0000 UTC" firstStartedPulling="2025-10-13 14:16:47.515478806 +0000 UTC m=+4185.049029072" lastFinishedPulling="2025-10-13 14:16:50.31037539 +0000 UTC m=+4187.843925686" observedRunningTime="2025-10-13 14:16:50.570225981 +0000 UTC m=+4188.103776257" watchObservedRunningTime="2025-10-13 14:16:50.571384669 +0000 UTC m=+4188.104934915" Oct 13 14:16:56 crc kubenswrapper[4797]: I1013 14:16:56.357225 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:56 crc kubenswrapper[4797]: I1013 14:16:56.357752 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:56 crc kubenswrapper[4797]: I1013 14:16:56.404063 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:56 crc kubenswrapper[4797]: I1013 14:16:56.659243 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:56 crc kubenswrapper[4797]: I1013 14:16:56.708782 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77769"] Oct 13 14:16:58 crc kubenswrapper[4797]: I1013 14:16:58.623144 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77769" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="registry-server" containerID="cri-o://3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab" gracePeriod=2 Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.070054 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.215129 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q2tg\" (UniqueName: \"kubernetes.io/projected/d9197d39-0195-4ce4-b59f-43598f6a2623-kube-api-access-8q2tg\") pod \"d9197d39-0195-4ce4-b59f-43598f6a2623\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.215236 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-utilities\") pod \"d9197d39-0195-4ce4-b59f-43598f6a2623\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.215295 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-catalog-content\") pod \"d9197d39-0195-4ce4-b59f-43598f6a2623\" (UID: \"d9197d39-0195-4ce4-b59f-43598f6a2623\") " Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.216535 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-utilities" (OuterVolumeSpecName: "utilities") pod "d9197d39-0195-4ce4-b59f-43598f6a2623" (UID: "d9197d39-0195-4ce4-b59f-43598f6a2623"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.220992 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9197d39-0195-4ce4-b59f-43598f6a2623-kube-api-access-8q2tg" (OuterVolumeSpecName: "kube-api-access-8q2tg") pod "d9197d39-0195-4ce4-b59f-43598f6a2623" (UID: "d9197d39-0195-4ce4-b59f-43598f6a2623"). InnerVolumeSpecName "kube-api-access-8q2tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.262433 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9197d39-0195-4ce4-b59f-43598f6a2623" (UID: "d9197d39-0195-4ce4-b59f-43598f6a2623"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.317332 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q2tg\" (UniqueName: \"kubernetes.io/projected/d9197d39-0195-4ce4-b59f-43598f6a2623-kube-api-access-8q2tg\") on node \"crc\" DevicePath \"\"" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.317369 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.317379 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9197d39-0195-4ce4-b59f-43598f6a2623-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.637751 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerID="3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab" exitCode=0 Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.637831 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77769" event={"ID":"d9197d39-0195-4ce4-b59f-43598f6a2623","Type":"ContainerDied","Data":"3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab"} Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.637866 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-77769" event={"ID":"d9197d39-0195-4ce4-b59f-43598f6a2623","Type":"ContainerDied","Data":"785987f1cdc229672ff292c94f87d44b988337a5899544657e1aa46a0fde4b0a"} Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.637903 4797 scope.go:117] "RemoveContainer" containerID="3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.638018 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77769" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.659133 4797 scope.go:117] "RemoveContainer" containerID="74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.680207 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77769"] Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.680279 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77769"] Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.684848 4797 scope.go:117] "RemoveContainer" containerID="b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.717372 4797 scope.go:117] "RemoveContainer" containerID="3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab" Oct 13 14:16:59 crc kubenswrapper[4797]: E1013 14:16:59.718007 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab\": container with ID starting with 3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab not found: ID does not exist" containerID="3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 
14:16:59.718071 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab"} err="failed to get container status \"3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab\": rpc error: code = NotFound desc = could not find container \"3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab\": container with ID starting with 3e335f53a498751c1da8bb0a3a51716e25442faec24fea63358449b126e381ab not found: ID does not exist" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.718106 4797 scope.go:117] "RemoveContainer" containerID="74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5" Oct 13 14:16:59 crc kubenswrapper[4797]: E1013 14:16:59.718546 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5\": container with ID starting with 74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5 not found: ID does not exist" containerID="74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.718588 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5"} err="failed to get container status \"74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5\": rpc error: code = NotFound desc = could not find container \"74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5\": container with ID starting with 74a1bb125da25fb00d226c5822a2495692fe1e7d51b4f5cbc7970d620dda84c5 not found: ID does not exist" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.718618 4797 scope.go:117] "RemoveContainer" containerID="b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8" Oct 13 14:16:59 crc 
kubenswrapper[4797]: E1013 14:16:59.719393 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8\": container with ID starting with b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8 not found: ID does not exist" containerID="b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8" Oct 13 14:16:59 crc kubenswrapper[4797]: I1013 14:16:59.719423 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8"} err="failed to get container status \"b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8\": rpc error: code = NotFound desc = could not find container \"b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8\": container with ID starting with b58c3e45909f7b05269571cedd931b2fa02bca08af90db4e6e5c143aac96d5b8 not found: ID does not exist" Oct 13 14:17:01 crc kubenswrapper[4797]: I1013 14:17:01.250941 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" path="/var/lib/kubelet/pods/d9197d39-0195-4ce4-b59f-43598f6a2623/volumes" Oct 13 14:17:48 crc kubenswrapper[4797]: I1013 14:17:48.120107 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:17:48 crc kubenswrapper[4797]: I1013 14:17:48.122376 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.120431 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.120995 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.252562 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"] Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.252882 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="extract-utilities" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.252902 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="extract-utilities" Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.252921 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="registry-server" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.252928 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="registry-server" Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.252940 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="extract-content" Oct 13 14:18:18 crc kubenswrapper[4797]: 
I1013 14:18:18.252946 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="extract-content" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.253103 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9197d39-0195-4ce4-b59f-43598f6a2623" containerName="registry-server" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.253845 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: W1013 14:18:18.255473 4797 reflector.go:561] object-"openstack"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 13 14:18:18 crc kubenswrapper[4797]: W1013 14:18:18.255504 4797 reflector.go:561] object-"openstack"/"dns": failed to list *v1.ConfigMap: configmaps "dns" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.255522 4797 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.255551 4797 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"dns\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in 
the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 13 14:18:18 crc kubenswrapper[4797]: W1013 14:18:18.256056 4797 reflector.go:561] object-"openstack"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.256182 4797 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 13 14:18:18 crc kubenswrapper[4797]: W1013 14:18:18.256310 4797 reflector.go:561] object-"openstack"/"dnsmasq-dns-dockercfg-lpr4f": failed to list *v1.Secret: secrets "dnsmasq-dns-dockercfg-lpr4f" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.256409 4797 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dnsmasq-dns-dockercfg-lpr4f\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"dnsmasq-dns-dockercfg-lpr4f\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 13 14:18:18 crc kubenswrapper[4797]: W1013 14:18:18.256627 4797 reflector.go:561] object-"openstack"/"dns-svc": failed to list *v1.ConfigMap: configmaps "dns-svc" is forbidden: User "system:node:crc" cannot 
list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 13 14:18:18 crc kubenswrapper[4797]: E1013 14:18:18.256721 4797 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dns-svc\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"dns-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.271920 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-config\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.272181 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddss\" (UniqueName: \"kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.272310 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.283167 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"] Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.374327 
4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-config\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.374477 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddss\" (UniqueName: \"kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.374533 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.453724 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557fbdf45f-4cpz9"] Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.455270 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.469312 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557fbdf45f-4cpz9"] Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.476565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-dns-svc\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.476643 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-config\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.476718 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x7qs\" (UniqueName: \"kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.578910 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-dns-svc\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.578982 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-config\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:18 crc kubenswrapper[4797]: I1013 14:18:18.579025 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7qs\" (UniqueName: \"kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.074434 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lpr4f" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.075240 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.318191 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.319991 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-config\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.325024 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.325606 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-config\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:19 
crc kubenswrapper[4797]: I1013 14:18:19.326123 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.330230 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.330234 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.330485 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7z8cm" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.330609 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.330854 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.374959 4797 configmap.go:193] Couldn't get configMap openstack/dns-svc: failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.375061 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc podName:d5b9774f-6729-449f-a5e6-b1ef4df0e647 nodeName:}" failed. No retries permitted until 2025-10-13 14:18:19.875023402 +0000 UTC m=+4277.408573658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc") pod "dnsmasq-dns-54f7fbfcf9-tqdd5" (UID: "d5b9774f-6729-449f-a5e6-b1ef4df0e647") : failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.389468 4797 projected.go:288] Couldn't get configMap openstack/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.389519 4797 projected.go:194] Error preparing data for projected volume kube-api-access-kddss for pod openstack/dnsmasq-dns-54f7fbfcf9-tqdd5: failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.389570 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss podName:d5b9774f-6729-449f-a5e6-b1ef4df0e647 nodeName:}" failed. No retries permitted until 2025-10-13 14:18:19.889553528 +0000 UTC m=+4277.423103784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kddss" (UniqueName: "kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss") pod "dnsmasq-dns-54f7fbfcf9-tqdd5" (UID: "d5b9774f-6729-449f-a5e6-b1ef4df0e647") : failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.411503 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.490710 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.490766 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/39f17318-f36e-49ec-ac64-b53ed1f136f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.490800 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.490901 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.490938 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/39f17318-f36e-49ec-ac64-b53ed1f136f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.490961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.491003 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.491124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.491208 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdzwn\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-kube-api-access-jdzwn\") pod \"rabbitmq-server-0\" (UID: 
\"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.491677 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.500442 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-dns-svc\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.592654 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.592771 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.592837 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/39f17318-f36e-49ec-ac64-b53ed1f136f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.592977 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-plugins-conf\") 
pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.593032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.593066 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.593108 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdzwn\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-kube-api-access-jdzwn\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.593142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.593171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/39f17318-f36e-49ec-ac64-b53ed1f136f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc 
kubenswrapper[4797]: I1013 14:18:19.593666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.593879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.594165 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.594719 4797 projected.go:288] Couldn't get configMap openstack/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.594754 4797 projected.go:194] Error preparing data for projected volume kube-api-access-4x7qs for pod openstack/dnsmasq-dns-557fbdf45f-4cpz9: failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: E1013 14:18:19.594818 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs podName:7c4c1d51-22f5-47e8-baa4-05f2ec403852 nodeName:}" failed. No retries permitted until 2025-10-13 14:18:20.094787085 +0000 UTC m=+4277.628337341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4x7qs" (UniqueName: "kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs") pod "dnsmasq-dns-557fbdf45f-4cpz9" (UID: "7c4c1d51-22f5-47e8-baa4-05f2ec403852") : failed to sync configmap cache: timed out waiting for the condition Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.594970 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.597138 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.597175 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f018f0ce9372d149d8985d17a9152c0300d4bb049edd760d75027a364ec41417/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.599496 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/39f17318-f36e-49ec-ac64-b53ed1f136f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.601320 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/39f17318-f36e-49ec-ac64-b53ed1f136f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.602110 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.622672 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.635762 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdzwn\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-kube-api-access-jdzwn\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.658093 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") " pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.662461 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.665343 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.667936 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.668204 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.668301 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-24j2c" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.668421 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.680496 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.682370 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.797757 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.797829 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.797913 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.797952 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.797995 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.798019 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frv6w\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-kube-api-access-frv6w\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.798053 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.798093 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.798119 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.899221 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.899629 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddss\" (UniqueName: \"kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.899792 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.899980 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.900124 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.900339 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.900501 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.900649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.900941 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.901180 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.901879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.901897 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.901990 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frv6w\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-kube-api-access-frv6w\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.901788 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.902507 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.903916 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddss\" (UniqueName: \"kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss\") pod \"dnsmasq-dns-54f7fbfcf9-tqdd5\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.904691 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.906651 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.906690 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d9b5c6133a2fd75e8a005ecab3887f623cd9089254a1d9df7fe9b22b6e7a859/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.908511 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.908508 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.910306 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.925448 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frv6w\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-kube-api-access-frv6w\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.946306 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 14:18:19 crc kubenswrapper[4797]: I1013 14:18:19.956825 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.024089 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.074334 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.105876 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7qs\" (UniqueName: \"kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.117670 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x7qs\" (UniqueName: \"kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs\") pod \"dnsmasq-dns-557fbdf45f-4cpz9\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.271170 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.315155 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.436878 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:18:20 crc kubenswrapper[4797]: W1013 14:18:20.456336 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39f17318_f36e_49ec_ac64_b53ed1f136f2.slice/crio-910be0feca868cf7a7eeaddbbab23f9ae7cab44474b0dab96a579e53f622c427 WatchSource:0}: Error finding container 910be0feca868cf7a7eeaddbbab23f9ae7cab44474b0dab96a579e53f622c427: Status 404 returned error can't find the container with id 910be0feca868cf7a7eeaddbbab23f9ae7cab44474b0dab96a579e53f622c427 Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.595751 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"] Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.735408 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557fbdf45f-4cpz9"] Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.933503 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.935675 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.939132 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.940362 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.940658 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.941146 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.941310 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-597xf" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.948200 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 13 14:18:20 crc kubenswrapper[4797]: I1013 14:18:20.961499 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042420 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042475 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-secrets\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc 
kubenswrapper[4797]: I1013 14:18:21.042504 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042524 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042548 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5cfn\" (UniqueName: \"kubernetes.io/projected/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-kube-api-access-q5cfn\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042590 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042646 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f5541a35-25a8-4e20-913b-6072f94c610e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5541a35-25a8-4e20-913b-6072f94c610e\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc 
kubenswrapper[4797]: I1013 14:18:21.042671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.042714 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144363 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f5541a35-25a8-4e20-913b-6072f94c610e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5541a35-25a8-4e20-913b-6072f94c610e\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144408 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144445 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 
14:18:21.144485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144521 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-secrets\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144554 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144570 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5cfn\" (UniqueName: \"kubernetes.io/projected/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-kube-api-access-q5cfn\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.144589 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.145334 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.146183 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.146390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.147251 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.152176 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " 
pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.153950 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.154851 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-secrets\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.168576 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.168612 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f5541a35-25a8-4e20-913b-6072f94c610e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5541a35-25a8-4e20-913b-6072f94c610e\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d60fe34959414039bf284600239caafe75f6fa145c9b8c1c1bf8386194951aa4/globalmount\"" pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.202517 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5cfn\" (UniqueName: \"kubernetes.io/projected/a5077271-1a56-4d9c-82e5-c8ecc23f5ef8-kube-api-access-q5cfn\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.224358 4797 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/memcached-0"] Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.225156 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f5541a35-25a8-4e20-913b-6072f94c610e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5541a35-25a8-4e20-913b-6072f94c610e\") pod \"openstack-galera-0\" (UID: \"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8\") " pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.225512 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.232012 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-cn6d2" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.232215 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.303302 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.318940 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.319565 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" event={"ID":"d5b9774f-6729-449f-a5e6-b1ef4df0e647","Type":"ContainerStarted","Data":"8c6771d36dcb97e8948c3f735ec9cf4bca455ff74c0c5ece22eb316636cda0b1"} Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.321246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"39f17318-f36e-49ec-ac64-b53ed1f136f2","Type":"ContainerStarted","Data":"910be0feca868cf7a7eeaddbbab23f9ae7cab44474b0dab96a579e53f622c427"} Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.321742 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" event={"ID":"7c4c1d51-22f5-47e8-baa4-05f2ec403852","Type":"ContainerStarted","Data":"7a5400b10dac7031454c0e8cd1850ab1244565603f63715e6e2082fe3f14e7b3"} Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.327244 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e8a03dc5-9e8b-416a-9225-7b2b3788eadd","Type":"ContainerStarted","Data":"7e1ea81d0387eb55a0784b4b148013e4728c4267e8efb46d4e072514a364e193"} Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.348342 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-config-data\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.348829 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-kolla-config\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.348909 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96ghf\" (UniqueName: \"kubernetes.io/projected/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-kube-api-access-96ghf\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.455631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96ghf\" (UniqueName: \"kubernetes.io/projected/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-kube-api-access-96ghf\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.455750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-config-data\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.455793 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-kolla-config\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.457560 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-config-data\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 
14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.459903 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-kolla-config\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.485425 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96ghf\" (UniqueName: \"kubernetes.io/projected/b8191ae4-3c4c-4059-8b22-ca8d1967e6f8-kube-api-access-96ghf\") pod \"memcached-0\" (UID: \"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8\") " pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.585190 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 13 14:18:21 crc kubenswrapper[4797]: I1013 14:18:21.882223 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.143095 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.325731 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.331080 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.333297 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lwzgn" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.333483 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.333604 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.336918 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.338625 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.343082 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8","Type":"ContainerStarted","Data":"db9bdb2b0ebbc9c856c8f08dd7b870d6efa5b3dc148f7ab2acdf5c0b5751b827"} Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.481702 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24h4\" (UniqueName: \"kubernetes.io/projected/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-kube-api-access-h24h4\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.482067 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " 
pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.482156 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.482236 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a99cc823-999d-470b-a968-ba391edd954d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a99cc823-999d-470b-a968-ba391edd954d\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.483088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.483158 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.483187 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.483217 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.483272 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585163 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585229 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585291 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585322 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h24h4\" (UniqueName: \"kubernetes.io/projected/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-kube-api-access-h24h4\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585399 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " 
pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585429 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a99cc823-999d-470b-a968-ba391edd954d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a99cc823-999d-470b-a968-ba391edd954d\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.585767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.586385 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.586969 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.588404 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc 
kubenswrapper[4797]: I1013 14:18:22.589807 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.589863 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a99cc823-999d-470b-a968-ba391edd954d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a99cc823-999d-470b-a968-ba391edd954d\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/40e1996bd1e318ba1adf61ab9887ee40ba319942ff35673a6dc01631ff0a2e89/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.591972 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.592241 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.592831 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.605693 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h24h4\" (UniqueName: \"kubernetes.io/projected/8eeb3b4f-e45c-44bb-874f-2445e71ea23a-kube-api-access-h24h4\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.616571 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a99cc823-999d-470b-a968-ba391edd954d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a99cc823-999d-470b-a968-ba391edd954d\") pod \"openstack-cell1-galera-0\" (UID: \"8eeb3b4f-e45c-44bb-874f-2445e71ea23a\") " pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:22 crc kubenswrapper[4797]: I1013 14:18:22.677734 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 14:18:23 crc kubenswrapper[4797]: I1013 14:18:23.914403 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 14:18:24 crc kubenswrapper[4797]: I1013 14:18:24.367325 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8","Type":"ContainerStarted","Data":"fead82d2dc1f202400e23f3fb4684f8f42c79c8adfc9e78436c631fc30d7f506"} Oct 13 14:18:25 crc kubenswrapper[4797]: I1013 14:18:25.376991 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8eeb3b4f-e45c-44bb-874f-2445e71ea23a","Type":"ContainerStarted","Data":"267097846b393d905e59acf9fdb39c315651abbe36e76d75331b0d7a77cea333"} Oct 13 14:18:27 crc kubenswrapper[4797]: I1013 14:18:27.394404 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"39f17318-f36e-49ec-ac64-b53ed1f136f2","Type":"ContainerStarted","Data":"7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704"} Oct 13 14:18:27 crc kubenswrapper[4797]: I1013 
14:18:27.396360 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e8a03dc5-9e8b-416a-9225-7b2b3788eadd","Type":"ContainerStarted","Data":"c41a498f495d9d7d28569b37c12bbfa092106858c1ce0547625a97de8ae50469"} Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.496952 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8","Type":"ContainerStarted","Data":"d2a2bb6f1d54528dda02efb0fee3a10444de296e9a1b436def7538b6d9388189"} Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.500415 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8eeb3b4f-e45c-44bb-874f-2445e71ea23a","Type":"ContainerStarted","Data":"dd14863459961a66a4e99fcf64a5d49bd28897d53a38172830298903e5cf9477"} Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.502911 4797 generic.go:334] "Generic (PLEG): container finished" podID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerID="62f2a724709bc33f4d4d642ab957bd4d85906624f14e6aabd0af7014e49df210" exitCode=0 Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.503147 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" event={"ID":"7c4c1d51-22f5-47e8-baa4-05f2ec403852","Type":"ContainerDied","Data":"62f2a724709bc33f4d4d642ab957bd4d85906624f14e6aabd0af7014e49df210"} Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.505023 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerID="5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab" exitCode=0 Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.505089 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" 
event={"ID":"d5b9774f-6729-449f-a5e6-b1ef4df0e647","Type":"ContainerDied","Data":"5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab"} Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.509722 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b8191ae4-3c4c-4059-8b22-ca8d1967e6f8","Type":"ContainerStarted","Data":"30b741c5fbcd7b238aea39a95dbe603bdc008547473fd774e5b661a834dc1109"} Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.510223 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 13 14:18:40 crc kubenswrapper[4797]: I1013 14:18:40.633152 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.604678244 podStartE2EDuration="19.633126885s" podCreationTimestamp="2025-10-13 14:18:21 +0000 UTC" firstStartedPulling="2025-10-13 14:18:23.44672649 +0000 UTC m=+4280.980276746" lastFinishedPulling="2025-10-13 14:18:39.475175131 +0000 UTC m=+4297.008725387" observedRunningTime="2025-10-13 14:18:40.602473374 +0000 UTC m=+4298.136023630" watchObservedRunningTime="2025-10-13 14:18:40.633126885 +0000 UTC m=+4298.166677141" Oct 13 14:18:41 crc kubenswrapper[4797]: I1013 14:18:41.537041 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" event={"ID":"d5b9774f-6729-449f-a5e6-b1ef4df0e647","Type":"ContainerStarted","Data":"3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef"} Oct 13 14:18:41 crc kubenswrapper[4797]: I1013 14:18:41.538131 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:41 crc kubenswrapper[4797]: I1013 14:18:41.540841 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" 
event={"ID":"7c4c1d51-22f5-47e8-baa4-05f2ec403852","Type":"ContainerStarted","Data":"9e84a71ed4627eda53d00d945dfc3c0edfd0bb958ba2076bff5d8c64119baed0"} Oct 13 14:18:41 crc kubenswrapper[4797]: I1013 14:18:41.556018 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" podStartSLOduration=4.868755611 podStartE2EDuration="23.556001811s" podCreationTimestamp="2025-10-13 14:18:18 +0000 UTC" firstStartedPulling="2025-10-13 14:18:20.888621218 +0000 UTC m=+4278.422171474" lastFinishedPulling="2025-10-13 14:18:39.575867418 +0000 UTC m=+4297.109417674" observedRunningTime="2025-10-13 14:18:41.551726566 +0000 UTC m=+4299.085276822" watchObservedRunningTime="2025-10-13 14:18:41.556001811 +0000 UTC m=+4299.089552067" Oct 13 14:18:41 crc kubenswrapper[4797]: I1013 14:18:41.571612 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" podStartSLOduration=4.896756977 podStartE2EDuration="23.571591263s" podCreationTimestamp="2025-10-13 14:18:18 +0000 UTC" firstStartedPulling="2025-10-13 14:18:20.901688608 +0000 UTC m=+4278.435238864" lastFinishedPulling="2025-10-13 14:18:39.576522894 +0000 UTC m=+4297.110073150" observedRunningTime="2025-10-13 14:18:41.571365047 +0000 UTC m=+4299.104915303" watchObservedRunningTime="2025-10-13 14:18:41.571591263 +0000 UTC m=+4299.105141509" Oct 13 14:18:42 crc kubenswrapper[4797]: I1013 14:18:42.548108 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:43 crc kubenswrapper[4797]: I1013 14:18:43.562710 4797 generic.go:334] "Generic (PLEG): container finished" podID="8eeb3b4f-e45c-44bb-874f-2445e71ea23a" containerID="dd14863459961a66a4e99fcf64a5d49bd28897d53a38172830298903e5cf9477" exitCode=0 Oct 13 14:18:43 crc kubenswrapper[4797]: I1013 14:18:43.562839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"8eeb3b4f-e45c-44bb-874f-2445e71ea23a","Type":"ContainerDied","Data":"dd14863459961a66a4e99fcf64a5d49bd28897d53a38172830298903e5cf9477"} Oct 13 14:18:43 crc kubenswrapper[4797]: I1013 14:18:43.565234 4797 generic.go:334] "Generic (PLEG): container finished" podID="a5077271-1a56-4d9c-82e5-c8ecc23f5ef8" containerID="d2a2bb6f1d54528dda02efb0fee3a10444de296e9a1b436def7538b6d9388189" exitCode=0 Oct 13 14:18:43 crc kubenswrapper[4797]: I1013 14:18:43.565302 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8","Type":"ContainerDied","Data":"d2a2bb6f1d54528dda02efb0fee3a10444de296e9a1b436def7538b6d9388189"} Oct 13 14:18:44 crc kubenswrapper[4797]: I1013 14:18:44.578137 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5077271-1a56-4d9c-82e5-c8ecc23f5ef8","Type":"ContainerStarted","Data":"aa90fcdeffb0bfbf138c0c9377a5b8633c8b87d60ff1c3cc66177713d00ffd98"} Oct 13 14:18:44 crc kubenswrapper[4797]: I1013 14:18:44.581527 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8eeb3b4f-e45c-44bb-874f-2445e71ea23a","Type":"ContainerStarted","Data":"6dcb301c5ef4f2bdaa8bd87709db1c3a11822f8c45c07ca8e27fc00da04db068"} Oct 13 14:18:44 crc kubenswrapper[4797]: I1013 14:18:44.614978 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.979190137 podStartE2EDuration="25.614945831s" podCreationTimestamp="2025-10-13 14:18:19 +0000 UTC" firstStartedPulling="2025-10-13 14:18:21.893575234 +0000 UTC m=+4279.427125490" lastFinishedPulling="2025-10-13 14:18:39.529330928 +0000 UTC m=+4297.062881184" observedRunningTime="2025-10-13 14:18:44.603550502 +0000 UTC m=+4302.137100768" watchObservedRunningTime="2025-10-13 14:18:44.614945831 +0000 UTC m=+4302.148496127" Oct 13 14:18:44 crc kubenswrapper[4797]: I1013 
14:18:44.638270 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.185821437 podStartE2EDuration="23.638249212s" podCreationTimestamp="2025-10-13 14:18:21 +0000 UTC" firstStartedPulling="2025-10-13 14:18:25.023857903 +0000 UTC m=+4282.557408159" lastFinishedPulling="2025-10-13 14:18:39.476285668 +0000 UTC m=+4297.009835934" observedRunningTime="2025-10-13 14:18:44.637418442 +0000 UTC m=+4302.170968718" watchObservedRunningTime="2025-10-13 14:18:44.638249212 +0000 UTC m=+4302.171799478" Oct 13 14:18:45 crc kubenswrapper[4797]: I1013 14:18:45.079020 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:45 crc kubenswrapper[4797]: I1013 14:18:45.274003 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:18:45 crc kubenswrapper[4797]: I1013 14:18:45.320870 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"] Oct 13 14:18:45 crc kubenswrapper[4797]: I1013 14:18:45.587373 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerName="dnsmasq-dns" containerID="cri-o://3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef" gracePeriod=10 Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.172297 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.233383 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc\") pod \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.233590 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kddss\" (UniqueName: \"kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss\") pod \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.233633 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-config\") pod \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\" (UID: \"d5b9774f-6729-449f-a5e6-b1ef4df0e647\") " Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.241695 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss" (OuterVolumeSpecName: "kube-api-access-kddss") pod "d5b9774f-6729-449f-a5e6-b1ef4df0e647" (UID: "d5b9774f-6729-449f-a5e6-b1ef4df0e647"). InnerVolumeSpecName "kube-api-access-kddss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.272348 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-config" (OuterVolumeSpecName: "config") pod "d5b9774f-6729-449f-a5e6-b1ef4df0e647" (UID: "d5b9774f-6729-449f-a5e6-b1ef4df0e647"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.278439 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5b9774f-6729-449f-a5e6-b1ef4df0e647" (UID: "d5b9774f-6729-449f-a5e6-b1ef4df0e647"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.334977 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kddss\" (UniqueName: \"kubernetes.io/projected/d5b9774f-6729-449f-a5e6-b1ef4df0e647-kube-api-access-kddss\") on node \"crc\" DevicePath \"\"" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.335062 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.335081 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b9774f-6729-449f-a5e6-b1ef4df0e647-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.586964 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.605701 4797 generic.go:334] "Generic (PLEG): container finished" podID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerID="3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef" exitCode=0 Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.605772 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" event={"ID":"d5b9774f-6729-449f-a5e6-b1ef4df0e647","Type":"ContainerDied","Data":"3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef"} 
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.605915 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5" event={"ID":"d5b9774f-6729-449f-a5e6-b1ef4df0e647","Type":"ContainerDied","Data":"8c6771d36dcb97e8948c3f735ec9cf4bca455ff74c0c5ece22eb316636cda0b1"}
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.605950 4797 scope.go:117] "RemoveContainer" containerID="3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef"
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.607040 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.650654 4797 scope.go:117] "RemoveContainer" containerID="5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab"
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.651834 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"]
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.657175 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f7fbfcf9-tqdd5"]
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.684171 4797 scope.go:117] "RemoveContainer" containerID="3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef"
Oct 13 14:18:46 crc kubenswrapper[4797]: E1013 14:18:46.685058 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef\": container with ID starting with 3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef not found: ID does not exist" containerID="3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef"
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.685327 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef"} err="failed to get container status \"3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef\": rpc error: code = NotFound desc = could not find container \"3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef\": container with ID starting with 3c9ac451f3620d797cb6708c2bd9582318ae657e9fcca998778c58982bfacaef not found: ID does not exist"
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.685434 4797 scope.go:117] "RemoveContainer" containerID="5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab"
Oct 13 14:18:46 crc kubenswrapper[4797]: E1013 14:18:46.685990 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab\": container with ID starting with 5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab not found: ID does not exist" containerID="5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab"
Oct 13 14:18:46 crc kubenswrapper[4797]: I1013 14:18:46.686039 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab"} err="failed to get container status \"5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab\": rpc error: code = NotFound desc = could not find container \"5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab\": container with ID starting with 5ae42815e5511f2399335f5db7bbd9002343e4735d2f4444344a3cb2ae5157ab not found: ID does not exist"
Oct 13 14:18:47 crc kubenswrapper[4797]: I1013 14:18:47.248216 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" path="/var/lib/kubelet/pods/d5b9774f-6729-449f-a5e6-b1ef4df0e647/volumes"
Oct 13 14:18:47 crc kubenswrapper[4797]: E1013 14:18:47.309202 4797 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.147:46074->38.102.83.147:46853: write tcp 38.102.83.147:46074->38.102.83.147:46853: write: broken pipe
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.120055 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.120151 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.120211 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs"
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.121089 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.121169 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" gracePeriod=600
Oct 13 14:18:48 crc kubenswrapper[4797]: E1013 14:18:48.255757 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.623168 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" exitCode=0
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.623212 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2"}
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.623532 4797 scope.go:117] "RemoveContainer" containerID="acb629200fdd59edad77e0212a8498d3d82c7cdb09b876143e9a24e3014955a5"
Oct 13 14:18:48 crc kubenswrapper[4797]: I1013 14:18:48.624242 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2"
Oct 13 14:18:48 crc kubenswrapper[4797]: E1013 14:18:48.624480 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:18:49 crc kubenswrapper[4797]: E1013 14:18:49.018419 4797 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.147:46086->38.102.83.147:46853: write tcp 38.102.83.147:46086->38.102.83.147:46853: write: broken pipe
Oct 13 14:18:51 crc kubenswrapper[4797]: I1013 14:18:51.304479 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 13 14:18:51 crc kubenswrapper[4797]: I1013 14:18:51.304968 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 13 14:18:51 crc kubenswrapper[4797]: I1013 14:18:51.354411 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 13 14:18:51 crc kubenswrapper[4797]: I1013 14:18:51.716617 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 13 14:18:52 crc kubenswrapper[4797]: I1013 14:18:52.678507 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 13 14:18:52 crc kubenswrapper[4797]: I1013 14:18:52.678562 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 13 14:18:52 crc kubenswrapper[4797]: I1013 14:18:52.724884 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 13 14:18:53 crc kubenswrapper[4797]: I1013 14:18:53.743303 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 13 14:18:58 crc kubenswrapper[4797]: I1013 14:18:58.720625 4797 generic.go:334] "Generic (PLEG): container finished" podID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerID="7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704" exitCode=0
Oct 13 14:18:58 crc kubenswrapper[4797]: I1013 14:18:58.720674 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"39f17318-f36e-49ec-ac64-b53ed1f136f2","Type":"ContainerDied","Data":"7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704"}
Oct 13 14:18:58 crc kubenswrapper[4797]: I1013 14:18:58.724104 4797 generic.go:334] "Generic (PLEG): container finished" podID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerID="c41a498f495d9d7d28569b37c12bbfa092106858c1ce0547625a97de8ae50469" exitCode=0
Oct 13 14:18:58 crc kubenswrapper[4797]: I1013 14:18:58.724144 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e8a03dc5-9e8b-416a-9225-7b2b3788eadd","Type":"ContainerDied","Data":"c41a498f495d9d7d28569b37c12bbfa092106858c1ce0547625a97de8ae50469"}
Oct 13 14:18:59 crc kubenswrapper[4797]: I1013 14:18:59.731743 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"39f17318-f36e-49ec-ac64-b53ed1f136f2","Type":"ContainerStarted","Data":"414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8"}
Oct 13 14:18:59 crc kubenswrapper[4797]: I1013 14:18:59.732248 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 13 14:18:59 crc kubenswrapper[4797]: I1013 14:18:59.734146 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e8a03dc5-9e8b-416a-9225-7b2b3788eadd","Type":"ContainerStarted","Data":"0d38697b8918c246aa99d2d005598d5565a3f8d8d9dcf5d81f94b92c3b9ab2be"}
Oct 13 14:18:59 crc kubenswrapper[4797]: I1013 14:18:59.734302 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 13 14:18:59 crc kubenswrapper[4797]: I1013 14:18:59.758192 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.814883312 podStartE2EDuration="41.758176419s" podCreationTimestamp="2025-10-13 14:18:18 +0000 UTC" firstStartedPulling="2025-10-13 14:18:20.46124592 +0000 UTC m=+4277.994796176" lastFinishedPulling="2025-10-13 14:18:25.404539027 +0000 UTC m=+4282.938089283" observedRunningTime="2025-10-13 14:18:59.75659274 +0000 UTC m=+4317.290143016" watchObservedRunningTime="2025-10-13 14:18:59.758176419 +0000 UTC m=+4317.291726675"
Oct 13 14:18:59 crc kubenswrapper[4797]: I1013 14:18:59.783674 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.027693164 podStartE2EDuration="41.783657743s" podCreationTimestamp="2025-10-13 14:18:18 +0000 UTC" firstStartedPulling="2025-10-13 14:18:20.334548216 +0000 UTC m=+4277.868098472" lastFinishedPulling="2025-10-13 14:18:25.090512795 +0000 UTC m=+4282.624063051" observedRunningTime="2025-10-13 14:18:59.778499766 +0000 UTC m=+4317.312050032" watchObservedRunningTime="2025-10-13 14:18:59.783657743 +0000 UTC m=+4317.317207999"
Oct 13 14:19:01 crc kubenswrapper[4797]: I1013 14:19:01.237222 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2"
Oct 13 14:19:01 crc kubenswrapper[4797]: E1013 14:19:01.237622 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:19:09 crc kubenswrapper[4797]: I1013 14:19:09.949419 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 13 14:19:10 crc kubenswrapper[4797]: I1013 14:19:10.028789 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 13 14:19:14 crc kubenswrapper[4797]: I1013 14:19:14.937894 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdd579685-r7l2d"]
Oct 13 14:19:14 crc kubenswrapper[4797]: E1013 14:19:14.939070 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerName="dnsmasq-dns"
Oct 13 14:19:14 crc kubenswrapper[4797]: I1013 14:19:14.939093 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerName="dnsmasq-dns"
Oct 13 14:19:14 crc kubenswrapper[4797]: E1013 14:19:14.939145 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerName="init"
Oct 13 14:19:14 crc kubenswrapper[4797]: I1013 14:19:14.939155 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerName="init"
Oct 13 14:19:14 crc kubenswrapper[4797]: I1013 14:19:14.941387 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b9774f-6729-449f-a5e6-b1ef4df0e647" containerName="dnsmasq-dns"
Oct 13 14:19:14 crc kubenswrapper[4797]: I1013 14:19:14.944202 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:14 crc kubenswrapper[4797]: I1013 14:19:14.946492 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdd579685-r7l2d"]
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.104482 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-dns-svc\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.104541 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m8tl\" (UniqueName: \"kubernetes.io/projected/ffc82494-7881-4544-9e63-cd6041bf8c2c-kube-api-access-8m8tl\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.104567 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-config\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.206030 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8tl\" (UniqueName: \"kubernetes.io/projected/ffc82494-7881-4544-9e63-cd6041bf8c2c-kube-api-access-8m8tl\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.206123 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-config\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.206371 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-dns-svc\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.207706 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-dns-svc\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.208572 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-config\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.236672 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2"
Oct 13 14:19:15 crc kubenswrapper[4797]: E1013 14:19:15.236933 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.252075 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8tl\" (UniqueName: \"kubernetes.io/projected/ffc82494-7881-4544-9e63-cd6041bf8c2c-kube-api-access-8m8tl\") pod \"dnsmasq-dns-fdd579685-r7l2d\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.267140 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:15 crc kubenswrapper[4797]: I1013 14:19:15.727216 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 13 14:19:16 crc kubenswrapper[4797]: I1013 14:19:16.230590 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdd579685-r7l2d"]
Oct 13 14:19:16 crc kubenswrapper[4797]: W1013 14:19:16.236707 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffc82494_7881_4544_9e63_cd6041bf8c2c.slice/crio-0d0e1ea2ffcc01c0c37dc02cfa29558c4bd7bbaaee10d170a34204f49dab6abc WatchSource:0}: Error finding container 0d0e1ea2ffcc01c0c37dc02cfa29558c4bd7bbaaee10d170a34204f49dab6abc: Status 404 returned error can't find the container with id 0d0e1ea2ffcc01c0c37dc02cfa29558c4bd7bbaaee10d170a34204f49dab6abc
Oct 13 14:19:16 crc kubenswrapper[4797]: I1013 14:19:16.259834 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 13 14:19:16 crc kubenswrapper[4797]: I1013 14:19:16.858437 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" event={"ID":"ffc82494-7881-4544-9e63-cd6041bf8c2c","Type":"ContainerStarted","Data":"0d0e1ea2ffcc01c0c37dc02cfa29558c4bd7bbaaee10d170a34204f49dab6abc"}
Oct 13 14:19:17 crc kubenswrapper[4797]: I1013 14:19:17.713613 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="rabbitmq" containerID="cri-o://414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8" gracePeriod=604799
Oct 13 14:19:17 crc kubenswrapper[4797]: I1013 14:19:17.869388 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" event={"ID":"ffc82494-7881-4544-9e63-cd6041bf8c2c","Type":"ContainerStarted","Data":"1490a034af90f099e5ff8b4d3fbecb32711ba3cb9ee1fb6a3081d2ed2299fa63"}
Oct 13 14:19:18 crc kubenswrapper[4797]: I1013 14:19:18.367305 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="rabbitmq" containerID="cri-o://0d38697b8918c246aa99d2d005598d5565a3f8d8d9dcf5d81f94b92c3b9ab2be" gracePeriod=604798
Oct 13 14:19:18 crc kubenswrapper[4797]: I1013 14:19:18.879585 4797 generic.go:334] "Generic (PLEG): container finished" podID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerID="1490a034af90f099e5ff8b4d3fbecb32711ba3cb9ee1fb6a3081d2ed2299fa63" exitCode=0
Oct 13 14:19:18 crc kubenswrapper[4797]: I1013 14:18.879648 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" event={"ID":"ffc82494-7881-4544-9e63-cd6041bf8c2c","Type":"ContainerDied","Data":"1490a034af90f099e5ff8b4d3fbecb32711ba3cb9ee1fb6a3081d2ed2299fa63"}
Oct 13 14:19:19 crc kubenswrapper[4797]: I1013 14:19:19.890685 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" event={"ID":"ffc82494-7881-4544-9e63-cd6041bf8c2c","Type":"ContainerStarted","Data":"34ca4a52cafa7d4b3c88c1a3f437f8d3fd5c1ca6c86fe08989ed6f5698a93ae4"}
Oct 13 14:19:19 crc kubenswrapper[4797]: I1013 14:19:19.890858 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdd579685-r7l2d"
Oct 13 14:19:19 crc kubenswrapper[4797]: I1013 14:19:19.911911 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" podStartSLOduration=5.91189349 podStartE2EDuration="5.91189349s" podCreationTimestamp="2025-10-13 14:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:19:19.905302238 +0000 UTC m=+4337.438852514" watchObservedRunningTime="2025-10-13 14:19:19.91189349 +0000 UTC m=+4337.445443746"
Oct 13 14:19:19 crc kubenswrapper[4797]: I1013 14:19:19.948061 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5672: connect: connection refused"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.025673 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5672: connect: connection refused"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.268768 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lgzz6"]
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.270435 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.283020 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lgzz6"]
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.381929 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-utilities\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.382027 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-catalog-content\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.382161 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8r2\" (UniqueName: \"kubernetes.io/projected/60a9c779-b261-4811-928b-305e24816f64-kube-api-access-5f8r2\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.484879 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-utilities\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.484940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-catalog-content\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.485038 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8r2\" (UniqueName: \"kubernetes.io/projected/60a9c779-b261-4811-928b-305e24816f64-kube-api-access-5f8r2\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.485317 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-utilities\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.485685 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-catalog-content\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.511899 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8r2\" (UniqueName: \"kubernetes.io/projected/60a9c779-b261-4811-928b-305e24816f64-kube-api-access-5f8r2\") pod \"community-operators-lgzz6\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:20 crc kubenswrapper[4797]: I1013 14:19:20.644796 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lgzz6"
Oct 13 14:19:21 crc kubenswrapper[4797]: I1013 14:19:21.076317 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lgzz6"]
Oct 13 14:19:21 crc kubenswrapper[4797]: I1013 14:19:21.907437 4797 generic.go:334] "Generic (PLEG): container finished" podID="60a9c779-b261-4811-928b-305e24816f64" containerID="48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71" exitCode=0
Oct 13 14:19:21 crc kubenswrapper[4797]: I1013 14:19:21.907550 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerDied","Data":"48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71"}
Oct 13 14:19:21 crc kubenswrapper[4797]: I1013 14:19:21.907750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerStarted","Data":"112ec702b07ea199740435bf5a0396eb2d6e47a1bed2c2835bdd50db3d6bb7d7"}
Oct 13 14:19:22 crc kubenswrapper[4797]: I1013 14:19:22.915229 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerStarted","Data":"0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298"}
Oct 13 14:19:23 crc kubenswrapper[4797]: I1013 14:19:23.925038 4797 generic.go:334] "Generic (PLEG): container finished" podID="60a9c779-b261-4811-928b-305e24816f64" containerID="0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298" exitCode=0
Oct 13 14:19:23 crc kubenswrapper[4797]: I1013 14:19:23.925139 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerDied","Data":"0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298"}
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.716692 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.848830 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-plugins\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.848917 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-confd\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.848971 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdzwn\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-kube-api-access-jdzwn\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.848997 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-server-conf\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.849021 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-plugins-conf\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.849148 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.849202 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/39f17318-f36e-49ec-ac64-b53ed1f136f2-erlang-cookie-secret\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.849277 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/39f17318-f36e-49ec-ac64-b53ed1f136f2-pod-info\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.849302 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-erlang-cookie\") pod \"39f17318-f36e-49ec-ac64-b53ed1f136f2\" (UID: \"39f17318-f36e-49ec-ac64-b53ed1f136f2\") "
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.849915 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.850111 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.850278 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.856853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-kube-api-access-jdzwn" (OuterVolumeSpecName: "kube-api-access-jdzwn") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "kube-api-access-jdzwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.856916 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/39f17318-f36e-49ec-ac64-b53ed1f136f2-pod-info" (OuterVolumeSpecName: "pod-info") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.858780 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f17318-f36e-49ec-ac64-b53ed1f136f2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.871609 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d" (OuterVolumeSpecName: "persistence") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.876468 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-server-conf" (OuterVolumeSpecName: "server-conf") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.936263 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "39f17318-f36e-49ec-ac64-b53ed1f136f2" (UID: "39f17318-f36e-49ec-ac64-b53ed1f136f2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.938131 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerStarted","Data":"a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452"}
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.941060 4797 generic.go:334] "Generic (PLEG): container finished" podID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerID="414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8" exitCode=0
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.941110 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"39f17318-f36e-49ec-ac64-b53ed1f136f2","Type":"ContainerDied","Data":"414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8"}
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.941136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"39f17318-f36e-49ec-ac64-b53ed1f136f2","Type":"ContainerDied","Data":"910be0feca868cf7a7eeaddbbab23f9ae7cab44474b0dab96a579e53f622c427"}
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.941151 4797 scope.go:117] "RemoveContainer" containerID="414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8"
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.941304 4797 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.947159 4797 generic.go:334] "Generic (PLEG): container finished" podID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerID="0d38697b8918c246aa99d2d005598d5565a3f8d8d9dcf5d81f94b92c3b9ab2be" exitCode=0 Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.947237 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e8a03dc5-9e8b-416a-9225-7b2b3788eadd","Type":"ContainerDied","Data":"0d38697b8918c246aa99d2d005598d5565a3f8d8d9dcf5d81f94b92c3b9ab2be"} Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950699 4797 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/39f17318-f36e-49ec-ac64-b53ed1f136f2-pod-info\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950728 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950757 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950767 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950779 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdzwn\" (UniqueName: \"kubernetes.io/projected/39f17318-f36e-49ec-ac64-b53ed1f136f2-kube-api-access-jdzwn\") on node \"crc\" DevicePath \"\"" Oct 13 
14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950791 4797 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-server-conf\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950865 4797 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/39f17318-f36e-49ec-ac64-b53ed1f136f2-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950920 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") on node \"crc\" " Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.950936 4797 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/39f17318-f36e-49ec-ac64-b53ed1f136f2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.971626 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lgzz6" podStartSLOduration=2.513932937 podStartE2EDuration="4.971606168s" podCreationTimestamp="2025-10-13 14:19:20 +0000 UTC" firstStartedPulling="2025-10-13 14:19:21.910737802 +0000 UTC m=+4339.444288068" lastFinishedPulling="2025-10-13 14:19:24.368411043 +0000 UTC m=+4341.901961299" observedRunningTime="2025-10-13 14:19:24.965988941 +0000 UTC m=+4342.499539217" watchObservedRunningTime="2025-10-13 14:19:24.971606168 +0000 UTC m=+4342.505156424" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.977344 4797 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.977561 4797 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d") on node "crc" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.983510 4797 scope.go:117] "RemoveContainer" containerID="7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704" Oct 13 14:19:24 crc kubenswrapper[4797]: I1013 14:19:24.997521 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.003206 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.003752 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.019691 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:19:25 crc kubenswrapper[4797]: E1013 14:19:25.020296 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="setup-container" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.020395 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="setup-container" Oct 13 14:19:25 crc kubenswrapper[4797]: E1013 14:19:25.020482 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="setup-container" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.020551 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="setup-container" Oct 13 14:19:25 crc kubenswrapper[4797]: E1013 14:19:25.020629 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="rabbitmq" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.020707 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="rabbitmq" Oct 13 14:19:25 crc kubenswrapper[4797]: E1013 14:19:25.020792 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="rabbitmq" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.020883 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="rabbitmq" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.021125 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" containerName="rabbitmq" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.021215 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" containerName="rabbitmq" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.022247 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.031310 4797 scope.go:117] "RemoveContainer" containerID="414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.031669 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7z8cm" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.031900 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.031948 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.032086 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 13 14:19:25 crc kubenswrapper[4797]: E1013 14:19:25.032221 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8\": container with ID starting with 414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8 not found: ID does not exist" containerID="414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.032360 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8"} err="failed to get container status \"414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8\": rpc error: code = NotFound desc = could not find container \"414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8\": container with ID starting with 414597fcbcad84e72511eda4dda75fc8c3b861347b9a78392491f9f8991028e8 not found: ID does not exist" Oct 13 14:19:25 crc 
kubenswrapper[4797]: I1013 14:19:25.032467 4797 scope.go:117] "RemoveContainer" containerID="7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704" Oct 13 14:19:25 crc kubenswrapper[4797]: E1013 14:19:25.036297 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704\": container with ID starting with 7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704 not found: ID does not exist" containerID="7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.036329 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704"} err="failed to get container status \"7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704\": rpc error: code = NotFound desc = could not find container \"7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704\": container with ID starting with 7737fcd4ec77ed9f548f43472327538c12217181a6cfb5072bff3388308f9704 not found: ID does not exist" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.041194 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.042684 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.054530 4797 reconciler_common.go:293] "Volume detached for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.156032 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-confd\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.156152 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-erlang-cookie\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.157259 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.157418 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-erlang-cookie-secret\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.158591 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-plugins-conf\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.158647 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-server-conf\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.158870 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.158952 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-plugins\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.159046 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frv6w\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-kube-api-access-frv6w\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.159134 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-pod-info\") pod \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\" (UID: \"e8a03dc5-9e8b-416a-9225-7b2b3788eadd\") " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.159500 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" 
Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.159694 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.159785 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4404e85-5dbc-4ac0-af0a-75886d50bb73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160313 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4404e85-5dbc-4ac0-af0a-75886d50bb73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160379 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160382 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160636 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4404e85-5dbc-4ac0-af0a-75886d50bb73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160720 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160733 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8j5b\" (UniqueName: \"kubernetes.io/projected/c4404e85-5dbc-4ac0-af0a-75886d50bb73-kube-api-access-h8j5b\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.160963 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.161024 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4404e85-5dbc-4ac0-af0a-75886d50bb73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.161238 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.161254 4797 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.161267 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.168041 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-kube-api-access-frv6w" (OuterVolumeSpecName: "kube-api-access-frv6w") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "kube-api-access-frv6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.168156 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-pod-info" (OuterVolumeSpecName: "pod-info") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.168189 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.183234 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de" (OuterVolumeSpecName: "persistence") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "pvc-a737523d-9656-44b2-9311-7d430d4ff5de". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.186382 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-server-conf" (OuterVolumeSpecName: "server-conf") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.259137 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f17318-f36e-49ec-ac64-b53ed1f136f2" path="/var/lib/kubelet/pods/39f17318-f36e-49ec-ac64-b53ed1f136f2/volumes" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.260541 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e8a03dc5-9e8b-416a-9225-7b2b3788eadd" (UID: "e8a03dc5-9e8b-416a-9225-7b2b3788eadd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262276 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262323 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4404e85-5dbc-4ac0-af0a-75886d50bb73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262359 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262418 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262454 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4404e85-5dbc-4ac0-af0a-75886d50bb73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262493 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4404e85-5dbc-4ac0-af0a-75886d50bb73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262521 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4404e85-5dbc-4ac0-af0a-75886d50bb73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262592 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8j5b\" (UniqueName: 
\"kubernetes.io/projected/c4404e85-5dbc-4ac0-af0a-75886d50bb73-kube-api-access-h8j5b\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262648 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262664 4797 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262678 4797 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-server-conf\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262705 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") on node \"crc\" " Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262722 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frv6w\" (UniqueName: \"kubernetes.io/projected/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-kube-api-access-frv6w\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.262735 4797 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e8a03dc5-9e8b-416a-9225-7b2b3788eadd-pod-info\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.263196 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.263961 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4404e85-5dbc-4ac0-af0a-75886d50bb73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.264316 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4404e85-5dbc-4ac0-af0a-75886d50bb73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.265281 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.266725 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4404e85-5dbc-4ac0-af0a-75886d50bb73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.266731 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4404e85-5dbc-4ac0-af0a-75886d50bb73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.268346 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4404e85-5dbc-4ac0-af0a-75886d50bb73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.272512 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.272637 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f018f0ce9372d149d8985d17a9152c0300d4bb049edd760d75027a364ec41417/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.278769 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8j5b\" (UniqueName: \"kubernetes.io/projected/c4404e85-5dbc-4ac0-af0a-75886d50bb73-kube-api-access-h8j5b\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.288727 4797 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.288969 4797 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a737523d-9656-44b2-9311-7d430d4ff5de" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de") on node "crc" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.295655 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.307714 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9f106f9-e2c9-457f-96e8-885c3fc1388d\") pod \"rabbitmq-server-0\" (UID: \"c4404e85-5dbc-4ac0-af0a-75886d50bb73\") " pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.360262 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.364408 4797 reconciler_common.go:293] "Volume detached for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.385004 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557fbdf45f-4cpz9"] Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.385493 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerName="dnsmasq-dns" containerID="cri-o://9e84a71ed4627eda53d00d945dfc3c0edfd0bb958ba2076bff5d8c64119baed0" gracePeriod=10 Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.829415 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.958185 4797 generic.go:334] "Generic (PLEG): container finished" podID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerID="9e84a71ed4627eda53d00d945dfc3c0edfd0bb958ba2076bff5d8c64119baed0" exitCode=0 Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.958283 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" event={"ID":"7c4c1d51-22f5-47e8-baa4-05f2ec403852","Type":"ContainerDied","Data":"9e84a71ed4627eda53d00d945dfc3c0edfd0bb958ba2076bff5d8c64119baed0"} Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.960404 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e8a03dc5-9e8b-416a-9225-7b2b3788eadd","Type":"ContainerDied","Data":"7e1ea81d0387eb55a0784b4b148013e4728c4267e8efb46d4e072514a364e193"} Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.960471 4797 scope.go:117] "RemoveContainer" containerID="0d38697b8918c246aa99d2d005598d5565a3f8d8d9dcf5d81f94b92c3b9ab2be" Oct 13 14:19:25 crc kubenswrapper[4797]: I1013 14:19:25.960476 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.012331 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.018065 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.035602 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.037018 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.040105 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.041135 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.041195 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.042678 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-24j2c" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.043391 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.057929 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:19:26 crc kubenswrapper[4797]: W1013 14:19:26.086589 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4404e85_5dbc_4ac0_af0a_75886d50bb73.slice/crio-d83ce88b68c883306a86f88b29105cacc035bc2482c9383162f46848ddfc68ff WatchSource:0}: Error finding container d83ce88b68c883306a86f88b29105cacc035bc2482c9383162f46848ddfc68ff: Status 404 returned error can't find the container with id d83ce88b68c883306a86f88b29105cacc035bc2482c9383162f46848ddfc68ff Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.094933 4797 scope.go:117] "RemoveContainer" containerID="c41a498f495d9d7d28569b37c12bbfa092106858c1ce0547625a97de8ae50469" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.102759 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.175735 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x7qs\" (UniqueName: \"kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs\") pod \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.175848 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-config\") pod \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.175990 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-dns-svc\") pod \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\" (UID: \"7c4c1d51-22f5-47e8-baa4-05f2ec403852\") " Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176163 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24ee7ce6-783b-433c-a9d0-1a3b81edd035-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176216 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24ee7ce6-783b-433c-a9d0-1a3b81edd035-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176233 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24ee7ce6-783b-433c-a9d0-1a3b81edd035-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176254 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176274 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24ee7ce6-783b-433c-a9d0-1a3b81edd035-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5gfp\" (UniqueName: \"kubernetes.io/projected/24ee7ce6-783b-433c-a9d0-1a3b81edd035-kube-api-access-f5gfp\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176353 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.176377 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.179819 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs" (OuterVolumeSpecName: "kube-api-access-4x7qs") pod "7c4c1d51-22f5-47e8-baa4-05f2ec403852" (UID: "7c4c1d51-22f5-47e8-baa4-05f2ec403852"). InnerVolumeSpecName "kube-api-access-4x7qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.277540 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24ee7ce6-783b-433c-a9d0-1a3b81edd035-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.277916 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278006 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24ee7ce6-783b-433c-a9d0-1a3b81edd035-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278100 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24ee7ce6-783b-433c-a9d0-1a3b81edd035-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278199 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278294 
4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24ee7ce6-783b-433c-a9d0-1a3b81edd035-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278400 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5gfp\" (UniqueName: \"kubernetes.io/projected/24ee7ce6-783b-433c-a9d0-1a3b81edd035-kube-api-access-f5gfp\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278517 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278614 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.278721 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x7qs\" (UniqueName: \"kubernetes.io/projected/7c4c1d51-22f5-47e8-baa4-05f2ec403852-kube-api-access-4x7qs\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.279170 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.279221 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24ee7ce6-783b-433c-a9d0-1a3b81edd035-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.279532 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.279669 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24ee7ce6-783b-433c-a9d0-1a3b81edd035-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.288568 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24ee7ce6-783b-433c-a9d0-1a3b81edd035-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.289088 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24ee7ce6-783b-433c-a9d0-1a3b81edd035-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.296644 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24ee7ce6-783b-433c-a9d0-1a3b81edd035-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.297005 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.297037 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d9b5c6133a2fd75e8a005ecab3887f623cd9089254a1d9df7fe9b22b6e7a859/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.302974 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5gfp\" (UniqueName: \"kubernetes.io/projected/24ee7ce6-783b-433c-a9d0-1a3b81edd035-kube-api-access-f5gfp\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.325918 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-config" (OuterVolumeSpecName: "config") pod "7c4c1d51-22f5-47e8-baa4-05f2ec403852" (UID: "7c4c1d51-22f5-47e8-baa4-05f2ec403852"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.327932 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c4c1d51-22f5-47e8-baa4-05f2ec403852" (UID: "7c4c1d51-22f5-47e8-baa4-05f2ec403852"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.329593 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a737523d-9656-44b2-9311-7d430d4ff5de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a737523d-9656-44b2-9311-7d430d4ff5de\") pod \"rabbitmq-cell1-server-0\" (UID: \"24ee7ce6-783b-433c-a9d0-1a3b81edd035\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.364240 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.380503 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.380548 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c4c1d51-22f5-47e8-baa4-05f2ec403852-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.779422 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 14:19:26 crc kubenswrapper[4797]: W1013 14:19:26.785876 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24ee7ce6_783b_433c_a9d0_1a3b81edd035.slice/crio-c200070226215c6419ff7e84603ac02d3f88ff4b74784e462462c93d2a061276 WatchSource:0}: Error finding container c200070226215c6419ff7e84603ac02d3f88ff4b74784e462462c93d2a061276: Status 404 returned error can't find the container with id c200070226215c6419ff7e84603ac02d3f88ff4b74784e462462c93d2a061276 Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.967662 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c4404e85-5dbc-4ac0-af0a-75886d50bb73","Type":"ContainerStarted","Data":"d83ce88b68c883306a86f88b29105cacc035bc2482c9383162f46848ddfc68ff"} Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.968676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24ee7ce6-783b-433c-a9d0-1a3b81edd035","Type":"ContainerStarted","Data":"c200070226215c6419ff7e84603ac02d3f88ff4b74784e462462c93d2a061276"} Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.970458 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" event={"ID":"7c4c1d51-22f5-47e8-baa4-05f2ec403852","Type":"ContainerDied","Data":"7a5400b10dac7031454c0e8cd1850ab1244565603f63715e6e2082fe3f14e7b3"} Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.970505 4797 scope.go:117] "RemoveContainer" containerID="9e84a71ed4627eda53d00d945dfc3c0edfd0bb958ba2076bff5d8c64119baed0" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.970516 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557fbdf45f-4cpz9" Oct 13 14:19:26 crc kubenswrapper[4797]: I1013 14:19:26.985025 4797 scope.go:117] "RemoveContainer" containerID="62f2a724709bc33f4d4d642ab957bd4d85906624f14e6aabd0af7014e49df210" Oct 13 14:19:27 crc kubenswrapper[4797]: I1013 14:19:27.007510 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557fbdf45f-4cpz9"] Oct 13 14:19:27 crc kubenswrapper[4797]: I1013 14:19:27.011924 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557fbdf45f-4cpz9"] Oct 13 14:19:27 crc kubenswrapper[4797]: I1013 14:19:27.247781 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" path="/var/lib/kubelet/pods/7c4c1d51-22f5-47e8-baa4-05f2ec403852/volumes" Oct 13 14:19:27 crc kubenswrapper[4797]: I1013 14:19:27.248909 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a03dc5-9e8b-416a-9225-7b2b3788eadd" path="/var/lib/kubelet/pods/e8a03dc5-9e8b-416a-9225-7b2b3788eadd/volumes" Oct 13 14:19:29 crc kubenswrapper[4797]: I1013 14:19:29.017326 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c4404e85-5dbc-4ac0-af0a-75886d50bb73","Type":"ContainerStarted","Data":"b68891e48276d3599c204ac63725a563b40fc0ef19e00e42e46b6abe53678ed7"} Oct 13 14:19:29 crc kubenswrapper[4797]: I1013 14:19:29.023015 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24ee7ce6-783b-433c-a9d0-1a3b81edd035","Type":"ContainerStarted","Data":"553f1bbf79e4f16b5ff345876bb1e62832ba212107d072ac3c8806deeba660bc"} Oct 13 14:19:30 crc kubenswrapper[4797]: I1013 14:19:30.236633 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:19:30 crc kubenswrapper[4797]: E1013 14:19:30.238476 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:19:30 crc kubenswrapper[4797]: I1013 14:19:30.645359 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lgzz6" Oct 13 14:19:30 crc kubenswrapper[4797]: I1013 14:19:30.645454 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lgzz6" Oct 13 14:19:30 crc kubenswrapper[4797]: I1013 14:19:30.710514 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lgzz6" Oct 13 14:19:31 crc kubenswrapper[4797]: I1013 14:19:31.102563 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lgzz6" Oct 13 14:19:32 crc kubenswrapper[4797]: I1013 14:19:32.257786 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lgzz6"] Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.076717 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lgzz6" 
podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="registry-server" containerID="cri-o://a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452" gracePeriod=2 Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.601452 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lgzz6" Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.696191 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f8r2\" (UniqueName: \"kubernetes.io/projected/60a9c779-b261-4811-928b-305e24816f64-kube-api-access-5f8r2\") pod \"60a9c779-b261-4811-928b-305e24816f64\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.696238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-utilities\") pod \"60a9c779-b261-4811-928b-305e24816f64\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.696348 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-catalog-content\") pod \"60a9c779-b261-4811-928b-305e24816f64\" (UID: \"60a9c779-b261-4811-928b-305e24816f64\") " Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.697057 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-utilities" (OuterVolumeSpecName: "utilities") pod "60a9c779-b261-4811-928b-305e24816f64" (UID: "60a9c779-b261-4811-928b-305e24816f64"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.706034 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a9c779-b261-4811-928b-305e24816f64-kube-api-access-5f8r2" (OuterVolumeSpecName: "kube-api-access-5f8r2") pod "60a9c779-b261-4811-928b-305e24816f64" (UID: "60a9c779-b261-4811-928b-305e24816f64"). InnerVolumeSpecName "kube-api-access-5f8r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.741018 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60a9c779-b261-4811-928b-305e24816f64" (UID: "60a9c779-b261-4811-928b-305e24816f64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.797415 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.797451 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f8r2\" (UniqueName: \"kubernetes.io/projected/60a9c779-b261-4811-928b-305e24816f64-kube-api-access-5f8r2\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:33 crc kubenswrapper[4797]: I1013 14:19:33.797464 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a9c779-b261-4811-928b-305e24816f64-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.087162 4797 generic.go:334] "Generic (PLEG): container finished" podID="60a9c779-b261-4811-928b-305e24816f64" 
containerID="a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452" exitCode=0 Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.087228 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lgzz6" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.087729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerDied","Data":"a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452"} Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.088085 4797 scope.go:117] "RemoveContainer" containerID="a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.088563 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzz6" event={"ID":"60a9c779-b261-4811-928b-305e24816f64","Type":"ContainerDied","Data":"112ec702b07ea199740435bf5a0396eb2d6e47a1bed2c2835bdd50db3d6bb7d7"} Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.106636 4797 scope.go:117] "RemoveContainer" containerID="0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.127199 4797 scope.go:117] "RemoveContainer" containerID="48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.128865 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lgzz6"] Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.138281 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lgzz6"] Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.175196 4797 scope.go:117] "RemoveContainer" containerID="a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452" Oct 13 
14:19:34 crc kubenswrapper[4797]: E1013 14:19:34.175605 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452\": container with ID starting with a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452 not found: ID does not exist" containerID="a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.175641 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452"} err="failed to get container status \"a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452\": rpc error: code = NotFound desc = could not find container \"a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452\": container with ID starting with a6f14aee03e299c3c4f1d22c3c78b6b6fe73c2d4462e4d9ad957757024c0d452 not found: ID does not exist" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.175661 4797 scope.go:117] "RemoveContainer" containerID="0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298" Oct 13 14:19:34 crc kubenswrapper[4797]: E1013 14:19:34.176581 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298\": container with ID starting with 0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298 not found: ID does not exist" containerID="0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.176620 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298"} err="failed to get container status 
\"0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298\": rpc error: code = NotFound desc = could not find container \"0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298\": container with ID starting with 0088ae072cac73db84fdb6ba3a022bbf24de818073734c8a2c7516b5576c5298 not found: ID does not exist" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.176647 4797 scope.go:117] "RemoveContainer" containerID="48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71" Oct 13 14:19:34 crc kubenswrapper[4797]: E1013 14:19:34.176952 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71\": container with ID starting with 48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71 not found: ID does not exist" containerID="48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71" Oct 13 14:19:34 crc kubenswrapper[4797]: I1013 14:19:34.176979 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71"} err="failed to get container status \"48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71\": rpc error: code = NotFound desc = could not find container \"48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71\": container with ID starting with 48428210cdba3b9dbeb4f6790bd051312add69b3ab77993f1d526b90803d2c71 not found: ID does not exist" Oct 13 14:19:35 crc kubenswrapper[4797]: I1013 14:19:35.248693 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a9c779-b261-4811-928b-305e24816f64" path="/var/lib/kubelet/pods/60a9c779-b261-4811-928b-305e24816f64/volumes" Oct 13 14:19:44 crc kubenswrapper[4797]: I1013 14:19:44.236956 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 
14:19:44 crc kubenswrapper[4797]: E1013 14:19:44.237832 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:19:58 crc kubenswrapper[4797]: I1013 14:19:58.235736 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:19:58 crc kubenswrapper[4797]: E1013 14:19:58.236542 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:20:01 crc kubenswrapper[4797]: I1013 14:20:01.333138 4797 generic.go:334] "Generic (PLEG): container finished" podID="c4404e85-5dbc-4ac0-af0a-75886d50bb73" containerID="b68891e48276d3599c204ac63725a563b40fc0ef19e00e42e46b6abe53678ed7" exitCode=0 Oct 13 14:20:01 crc kubenswrapper[4797]: I1013 14:20:01.333225 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c4404e85-5dbc-4ac0-af0a-75886d50bb73","Type":"ContainerDied","Data":"b68891e48276d3599c204ac63725a563b40fc0ef19e00e42e46b6abe53678ed7"} Oct 13 14:20:02 crc kubenswrapper[4797]: I1013 14:20:02.352881 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c4404e85-5dbc-4ac0-af0a-75886d50bb73","Type":"ContainerStarted","Data":"b26856c0913b8e4d05b29491700b99832d6ef8807370a691123a2db1b49b55d5"} Oct 13 14:20:02 crc kubenswrapper[4797]: I1013 14:20:02.354041 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 13 14:20:02 crc kubenswrapper[4797]: I1013 14:20:02.355283 4797 generic.go:334] "Generic (PLEG): container finished" podID="24ee7ce6-783b-433c-a9d0-1a3b81edd035" containerID="553f1bbf79e4f16b5ff345876bb1e62832ba212107d072ac3c8806deeba660bc" exitCode=0 Oct 13 14:20:02 crc kubenswrapper[4797]: I1013 14:20:02.355397 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24ee7ce6-783b-433c-a9d0-1a3b81edd035","Type":"ContainerDied","Data":"553f1bbf79e4f16b5ff345876bb1e62832ba212107d072ac3c8806deeba660bc"} Oct 13 14:20:02 crc kubenswrapper[4797]: I1013 14:20:02.383749 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.38372837 podStartE2EDuration="38.38372837s" podCreationTimestamp="2025-10-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:20:02.378150204 +0000 UTC m=+4379.911700470" watchObservedRunningTime="2025-10-13 14:20:02.38372837 +0000 UTC m=+4379.917278646" Oct 13 14:20:03 crc kubenswrapper[4797]: I1013 14:20:03.368841 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24ee7ce6-783b-433c-a9d0-1a3b81edd035","Type":"ContainerStarted","Data":"eb4ad42e4fce005312f14a5d443f46f92bb611b466150980e15fac45d90dd0b8"} Oct 13 14:20:03 crc kubenswrapper[4797]: I1013 14:20:03.369552 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:20:03 crc kubenswrapper[4797]: I1013 14:20:03.405319 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.405301134 podStartE2EDuration="37.405301134s" podCreationTimestamp="2025-10-13 14:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:20:03.398758644 +0000 UTC m=+4380.932308920" watchObservedRunningTime="2025-10-13 14:20:03.405301134 +0000 UTC m=+4380.938851390" Oct 13 14:20:10 crc kubenswrapper[4797]: I1013 14:20:10.235784 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:20:10 crc kubenswrapper[4797]: E1013 14:20:10.236554 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:20:15 crc kubenswrapper[4797]: I1013 14:20:15.363698 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 13 14:20:16 crc kubenswrapper[4797]: I1013 14:20:16.368303 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 13 14:20:21 crc kubenswrapper[4797]: I1013 14:20:21.237398 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:20:21 crc kubenswrapper[4797]: E1013 14:20:21.238262 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.026584 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 13 14:20:26 crc kubenswrapper[4797]: E1013 14:20:26.027196 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="extract-utilities" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027213 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="extract-utilities" Oct 13 14:20:26 crc kubenswrapper[4797]: E1013 14:20:26.027226 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerName="init" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027234 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerName="init" Oct 13 14:20:26 crc kubenswrapper[4797]: E1013 14:20:26.027265 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="registry-server" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027273 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="registry-server" Oct 13 14:20:26 crc kubenswrapper[4797]: E1013 14:20:26.027286 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerName="dnsmasq-dns" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027294 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerName="dnsmasq-dns" Oct 13 14:20:26 crc kubenswrapper[4797]: E1013 
14:20:26.027305 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="extract-content" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027313 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="extract-content" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027492 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a9c779-b261-4811-928b-305e24816f64" containerName="registry-server" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.027511 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c4c1d51-22f5-47e8-baa4-05f2ec403852" containerName="dnsmasq-dns" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.028335 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.031078 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dj9th" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.038976 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.157403 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vmnb\" (UniqueName: \"kubernetes.io/projected/c8b368d0-6d78-4a55-a5a2-fb0078ae2115-kube-api-access-8vmnb\") pod \"mariadb-client-1-default\" (UID: \"c8b368d0-6d78-4a55-a5a2-fb0078ae2115\") " pod="openstack/mariadb-client-1-default" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.258899 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vmnb\" (UniqueName: \"kubernetes.io/projected/c8b368d0-6d78-4a55-a5a2-fb0078ae2115-kube-api-access-8vmnb\") pod \"mariadb-client-1-default\" (UID: 
\"c8b368d0-6d78-4a55-a5a2-fb0078ae2115\") " pod="openstack/mariadb-client-1-default" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.296245 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vmnb\" (UniqueName: \"kubernetes.io/projected/c8b368d0-6d78-4a55-a5a2-fb0078ae2115-kube-api-access-8vmnb\") pod \"mariadb-client-1-default\" (UID: \"c8b368d0-6d78-4a55-a5a2-fb0078ae2115\") " pod="openstack/mariadb-client-1-default" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.374496 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.890866 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 13 14:20:26 crc kubenswrapper[4797]: W1013 14:20:26.899492 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b368d0_6d78_4a55_a5a2_fb0078ae2115.slice/crio-ef770ce1655454d418a2ea0070902eb64f938b543f53e5f3508ca9c771304d6d WatchSource:0}: Error finding container ef770ce1655454d418a2ea0070902eb64f938b543f53e5f3508ca9c771304d6d: Status 404 returned error can't find the container with id ef770ce1655454d418a2ea0070902eb64f938b543f53e5f3508ca9c771304d6d Oct 13 14:20:26 crc kubenswrapper[4797]: I1013 14:20:26.902552 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:20:27 crc kubenswrapper[4797]: I1013 14:20:27.563334 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"c8b368d0-6d78-4a55-a5a2-fb0078ae2115","Type":"ContainerStarted","Data":"ef770ce1655454d418a2ea0070902eb64f938b543f53e5f3508ca9c771304d6d"} Oct 13 14:20:30 crc kubenswrapper[4797]: I1013 14:20:30.591780 4797 generic.go:334] "Generic (PLEG): container finished" podID="c8b368d0-6d78-4a55-a5a2-fb0078ae2115" 
containerID="9d949ba8220d4eaad18949311d083828126fd0e03af5ecc6897307ffffc1e19d" exitCode=0 Oct 13 14:20:30 crc kubenswrapper[4797]: I1013 14:20:30.591841 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"c8b368d0-6d78-4a55-a5a2-fb0078ae2115","Type":"ContainerDied","Data":"9d949ba8220d4eaad18949311d083828126fd0e03af5ecc6897307ffffc1e19d"} Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.058716 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.088209 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_c8b368d0-6d78-4a55-a5a2-fb0078ae2115/mariadb-client-1-default/0.log" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.116592 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.121365 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.161220 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vmnb\" (UniqueName: \"kubernetes.io/projected/c8b368d0-6d78-4a55-a5a2-fb0078ae2115-kube-api-access-8vmnb\") pod \"c8b368d0-6d78-4a55-a5a2-fb0078ae2115\" (UID: \"c8b368d0-6d78-4a55-a5a2-fb0078ae2115\") " Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.166369 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b368d0-6d78-4a55-a5a2-fb0078ae2115-kube-api-access-8vmnb" (OuterVolumeSpecName: "kube-api-access-8vmnb") pod "c8b368d0-6d78-4a55-a5a2-fb0078ae2115" (UID: "c8b368d0-6d78-4a55-a5a2-fb0078ae2115"). InnerVolumeSpecName "kube-api-access-8vmnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.236593 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:20:32 crc kubenswrapper[4797]: E1013 14:20:32.236865 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.263230 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vmnb\" (UniqueName: \"kubernetes.io/projected/c8b368d0-6d78-4a55-a5a2-fb0078ae2115-kube-api-access-8vmnb\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.556925 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 13 14:20:32 crc kubenswrapper[4797]: E1013 14:20:32.557388 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b368d0-6d78-4a55-a5a2-fb0078ae2115" containerName="mariadb-client-1-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.557419 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b368d0-6d78-4a55-a5a2-fb0078ae2115" containerName="mariadb-client-1-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.557677 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b368d0-6d78-4a55-a5a2-fb0078ae2115" containerName="mariadb-client-1-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.559793 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.574288 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.610550 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef770ce1655454d418a2ea0070902eb64f938b543f53e5f3508ca9c771304d6d" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.610603 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.668565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scnmz\" (UniqueName: \"kubernetes.io/projected/0e6f4323-2f40-4508-a22c-99fded8d964e-kube-api-access-scnmz\") pod \"mariadb-client-2-default\" (UID: \"0e6f4323-2f40-4508-a22c-99fded8d964e\") " pod="openstack/mariadb-client-2-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.770163 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scnmz\" (UniqueName: \"kubernetes.io/projected/0e6f4323-2f40-4508-a22c-99fded8d964e-kube-api-access-scnmz\") pod \"mariadb-client-2-default\" (UID: \"0e6f4323-2f40-4508-a22c-99fded8d964e\") " pod="openstack/mariadb-client-2-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.792985 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scnmz\" (UniqueName: \"kubernetes.io/projected/0e6f4323-2f40-4508-a22c-99fded8d964e-kube-api-access-scnmz\") pod \"mariadb-client-2-default\" (UID: \"0e6f4323-2f40-4508-a22c-99fded8d964e\") " pod="openstack/mariadb-client-2-default" Oct 13 14:20:32 crc kubenswrapper[4797]: I1013 14:20:32.893033 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 13 14:20:33 crc kubenswrapper[4797]: I1013 14:20:33.247140 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b368d0-6d78-4a55-a5a2-fb0078ae2115" path="/var/lib/kubelet/pods/c8b368d0-6d78-4a55-a5a2-fb0078ae2115/volumes" Oct 13 14:20:33 crc kubenswrapper[4797]: I1013 14:20:33.475249 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 13 14:20:33 crc kubenswrapper[4797]: W1013 14:20:33.482632 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e6f4323_2f40_4508_a22c_99fded8d964e.slice/crio-7181220a4835231f46199966c73e550744fd5114e0211c7b0c203ec38291d341 WatchSource:0}: Error finding container 7181220a4835231f46199966c73e550744fd5114e0211c7b0c203ec38291d341: Status 404 returned error can't find the container with id 7181220a4835231f46199966c73e550744fd5114e0211c7b0c203ec38291d341 Oct 13 14:20:33 crc kubenswrapper[4797]: I1013 14:20:33.618039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"0e6f4323-2f40-4508-a22c-99fded8d964e","Type":"ContainerStarted","Data":"7181220a4835231f46199966c73e550744fd5114e0211c7b0c203ec38291d341"} Oct 13 14:20:34 crc kubenswrapper[4797]: I1013 14:20:34.628343 4797 generic.go:334] "Generic (PLEG): container finished" podID="0e6f4323-2f40-4508-a22c-99fded8d964e" containerID="3e07c0bbbd68aae6e7c637d0ef7db977ddfe0745008fd17f9a72c382caf48bec" exitCode=0 Oct 13 14:20:34 crc kubenswrapper[4797]: I1013 14:20:34.628417 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"0e6f4323-2f40-4508-a22c-99fded8d964e","Type":"ContainerDied","Data":"3e07c0bbbd68aae6e7c637d0ef7db977ddfe0745008fd17f9a72c382caf48bec"} Oct 13 14:20:35 crc kubenswrapper[4797]: I1013 14:20:35.986349 4797 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.029508 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_0e6f4323-2f40-4508-a22c-99fded8d964e/mariadb-client-2-default/0.log" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.030896 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scnmz\" (UniqueName: \"kubernetes.io/projected/0e6f4323-2f40-4508-a22c-99fded8d964e-kube-api-access-scnmz\") pod \"0e6f4323-2f40-4508-a22c-99fded8d964e\" (UID: \"0e6f4323-2f40-4508-a22c-99fded8d964e\") " Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.036120 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6f4323-2f40-4508-a22c-99fded8d964e-kube-api-access-scnmz" (OuterVolumeSpecName: "kube-api-access-scnmz") pod "0e6f4323-2f40-4508-a22c-99fded8d964e" (UID: "0e6f4323-2f40-4508-a22c-99fded8d964e"). InnerVolumeSpecName "kube-api-access-scnmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.053616 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.059273 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.133383 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scnmz\" (UniqueName: \"kubernetes.io/projected/0e6f4323-2f40-4508-a22c-99fded8d964e-kube-api-access-scnmz\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.482920 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 13 14:20:36 crc kubenswrapper[4797]: E1013 14:20:36.483324 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6f4323-2f40-4508-a22c-99fded8d964e" containerName="mariadb-client-2-default" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.483349 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6f4323-2f40-4508-a22c-99fded8d964e" containerName="mariadb-client-2-default" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.483494 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6f4323-2f40-4508-a22c-99fded8d964e" containerName="mariadb-client-2-default" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.484082 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.503946 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.541679 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbdb9\" (UniqueName: \"kubernetes.io/projected/b3c83441-9381-4c48-baff-1aa3ddab2579-kube-api-access-tbdb9\") pod \"mariadb-client-1\" (UID: \"b3c83441-9381-4c48-baff-1aa3ddab2579\") " pod="openstack/mariadb-client-1" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.643604 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbdb9\" (UniqueName: \"kubernetes.io/projected/b3c83441-9381-4c48-baff-1aa3ddab2579-kube-api-access-tbdb9\") pod \"mariadb-client-1\" (UID: \"b3c83441-9381-4c48-baff-1aa3ddab2579\") " pod="openstack/mariadb-client-1" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.648033 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7181220a4835231f46199966c73e550744fd5114e0211c7b0c203ec38291d341" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.648083 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.676597 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbdb9\" (UniqueName: \"kubernetes.io/projected/b3c83441-9381-4c48-baff-1aa3ddab2579-kube-api-access-tbdb9\") pod \"mariadb-client-1\" (UID: \"b3c83441-9381-4c48-baff-1aa3ddab2579\") " pod="openstack/mariadb-client-1" Oct 13 14:20:36 crc kubenswrapper[4797]: I1013 14:20:36.802224 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 13 14:20:37 crc kubenswrapper[4797]: I1013 14:20:37.260147 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6f4323-2f40-4508-a22c-99fded8d964e" path="/var/lib/kubelet/pods/0e6f4323-2f40-4508-a22c-99fded8d964e/volumes" Oct 13 14:20:37 crc kubenswrapper[4797]: I1013 14:20:37.331658 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 13 14:20:37 crc kubenswrapper[4797]: W1013 14:20:37.388100 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3c83441_9381_4c48_baff_1aa3ddab2579.slice/crio-723ab3bac49a758e0fc198cd37c0a57c72e4acc9d29cfe8069dde23ff81b83f7 WatchSource:0}: Error finding container 723ab3bac49a758e0fc198cd37c0a57c72e4acc9d29cfe8069dde23ff81b83f7: Status 404 returned error can't find the container with id 723ab3bac49a758e0fc198cd37c0a57c72e4acc9d29cfe8069dde23ff81b83f7 Oct 13 14:20:37 crc kubenswrapper[4797]: I1013 14:20:37.659141 4797 generic.go:334] "Generic (PLEG): container finished" podID="b3c83441-9381-4c48-baff-1aa3ddab2579" containerID="7df8f2baf285eb6a7027e994031290a46a7ffd2c8c7998d845092f5f8121a367" exitCode=0 Oct 13 14:20:37 crc kubenswrapper[4797]: I1013 14:20:37.659434 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"b3c83441-9381-4c48-baff-1aa3ddab2579","Type":"ContainerDied","Data":"7df8f2baf285eb6a7027e994031290a46a7ffd2c8c7998d845092f5f8121a367"} Oct 13 14:20:37 crc kubenswrapper[4797]: I1013 14:20:37.659460 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"b3c83441-9381-4c48-baff-1aa3ddab2579","Type":"ContainerStarted","Data":"723ab3bac49a758e0fc198cd37c0a57c72e4acc9d29cfe8069dde23ff81b83f7"} Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.265549 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.285901 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_b3c83441-9381-4c48-baff-1aa3ddab2579/mariadb-client-1/0.log" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.311506 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.330764 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.391723 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbdb9\" (UniqueName: \"kubernetes.io/projected/b3c83441-9381-4c48-baff-1aa3ddab2579-kube-api-access-tbdb9\") pod \"b3c83441-9381-4c48-baff-1aa3ddab2579\" (UID: \"b3c83441-9381-4c48-baff-1aa3ddab2579\") " Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.397226 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c83441-9381-4c48-baff-1aa3ddab2579-kube-api-access-tbdb9" (OuterVolumeSpecName: "kube-api-access-tbdb9") pod "b3c83441-9381-4c48-baff-1aa3ddab2579" (UID: "b3c83441-9381-4c48-baff-1aa3ddab2579"). InnerVolumeSpecName "kube-api-access-tbdb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.494497 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbdb9\" (UniqueName: \"kubernetes.io/projected/b3c83441-9381-4c48-baff-1aa3ddab2579-kube-api-access-tbdb9\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.688070 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723ab3bac49a758e0fc198cd37c0a57c72e4acc9d29cfe8069dde23ff81b83f7" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.688157 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.739741 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 13 14:20:39 crc kubenswrapper[4797]: E1013 14:20:39.740163 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c83441-9381-4c48-baff-1aa3ddab2579" containerName="mariadb-client-1" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.740189 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c83441-9381-4c48-baff-1aa3ddab2579" containerName="mariadb-client-1" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.740350 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3c83441-9381-4c48-baff-1aa3ddab2579" containerName="mariadb-client-1" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.741020 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.746034 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dj9th" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.748047 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.799502 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmsvk\" (UniqueName: \"kubernetes.io/projected/9350c3ee-9427-4db6-a94e-4a116fedcb97-kube-api-access-gmsvk\") pod \"mariadb-client-4-default\" (UID: \"9350c3ee-9427-4db6-a94e-4a116fedcb97\") " pod="openstack/mariadb-client-4-default" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.900929 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmsvk\" (UniqueName: \"kubernetes.io/projected/9350c3ee-9427-4db6-a94e-4a116fedcb97-kube-api-access-gmsvk\") pod \"mariadb-client-4-default\" (UID: \"9350c3ee-9427-4db6-a94e-4a116fedcb97\") " pod="openstack/mariadb-client-4-default" Oct 13 14:20:39 crc kubenswrapper[4797]: I1013 14:20:39.931216 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmsvk\" (UniqueName: \"kubernetes.io/projected/9350c3ee-9427-4db6-a94e-4a116fedcb97-kube-api-access-gmsvk\") pod \"mariadb-client-4-default\" (UID: \"9350c3ee-9427-4db6-a94e-4a116fedcb97\") " pod="openstack/mariadb-client-4-default" Oct 13 14:20:40 crc kubenswrapper[4797]: I1013 14:20:40.076351 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 13 14:20:40 crc kubenswrapper[4797]: W1013 14:20:40.657137 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9350c3ee_9427_4db6_a94e_4a116fedcb97.slice/crio-2a80e38cf726c37b64381bed373c42412b73346c0726a8d5861f9138e62dad9c WatchSource:0}: Error finding container 2a80e38cf726c37b64381bed373c42412b73346c0726a8d5861f9138e62dad9c: Status 404 returned error can't find the container with id 2a80e38cf726c37b64381bed373c42412b73346c0726a8d5861f9138e62dad9c Oct 13 14:20:40 crc kubenswrapper[4797]: I1013 14:20:40.657477 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 13 14:20:40 crc kubenswrapper[4797]: I1013 14:20:40.700634 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"9350c3ee-9427-4db6-a94e-4a116fedcb97","Type":"ContainerStarted","Data":"2a80e38cf726c37b64381bed373c42412b73346c0726a8d5861f9138e62dad9c"} Oct 13 14:20:41 crc kubenswrapper[4797]: I1013 14:20:41.246061 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c83441-9381-4c48-baff-1aa3ddab2579" path="/var/lib/kubelet/pods/b3c83441-9381-4c48-baff-1aa3ddab2579/volumes" Oct 13 14:20:41 crc kubenswrapper[4797]: I1013 14:20:41.711416 4797 generic.go:334] "Generic (PLEG): container finished" podID="9350c3ee-9427-4db6-a94e-4a116fedcb97" containerID="7d88c9d89711934c73ab5ca1201dd7012bb4e4ac31aff77fa97d37be8482d480" exitCode=0 Oct 13 14:20:41 crc kubenswrapper[4797]: I1013 14:20:41.711487 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"9350c3ee-9427-4db6-a94e-4a116fedcb97","Type":"ContainerDied","Data":"7d88c9d89711934c73ab5ca1201dd7012bb4e4ac31aff77fa97d37be8482d480"} Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.095120 4797 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.110428 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_9350c3ee-9427-4db6-a94e-4a116fedcb97/mariadb-client-4-default/0.log" Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.161553 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmsvk\" (UniqueName: \"kubernetes.io/projected/9350c3ee-9427-4db6-a94e-4a116fedcb97-kube-api-access-gmsvk\") pod \"9350c3ee-9427-4db6-a94e-4a116fedcb97\" (UID: \"9350c3ee-9427-4db6-a94e-4a116fedcb97\") " Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.171172 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9350c3ee-9427-4db6-a94e-4a116fedcb97-kube-api-access-gmsvk" (OuterVolumeSpecName: "kube-api-access-gmsvk") pod "9350c3ee-9427-4db6-a94e-4a116fedcb97" (UID: "9350c3ee-9427-4db6-a94e-4a116fedcb97"). InnerVolumeSpecName "kube-api-access-gmsvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.172804 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.178794 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.244914 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9350c3ee-9427-4db6-a94e-4a116fedcb97" path="/var/lib/kubelet/pods/9350c3ee-9427-4db6-a94e-4a116fedcb97/volumes" Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.262852 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmsvk\" (UniqueName: \"kubernetes.io/projected/9350c3ee-9427-4db6-a94e-4a116fedcb97-kube-api-access-gmsvk\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.731187 4797 scope.go:117] "RemoveContainer" containerID="7d88c9d89711934c73ab5ca1201dd7012bb4e4ac31aff77fa97d37be8482d480" Oct 13 14:20:43 crc kubenswrapper[4797]: I1013 14:20:43.731248 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 13 14:20:44 crc kubenswrapper[4797]: I1013 14:20:44.236766 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:20:44 crc kubenswrapper[4797]: E1013 14:20:44.237786 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.187094 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 13 14:20:47 crc kubenswrapper[4797]: E1013 14:20:47.187731 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9350c3ee-9427-4db6-a94e-4a116fedcb97" containerName="mariadb-client-4-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.187742 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9350c3ee-9427-4db6-a94e-4a116fedcb97" containerName="mariadb-client-4-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.187954 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9350c3ee-9427-4db6-a94e-4a116fedcb97" containerName="mariadb-client-4-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.188472 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.190166 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dj9th" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.197127 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.356369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbgw\" (UniqueName: \"kubernetes.io/projected/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5-kube-api-access-lcbgw\") pod \"mariadb-client-5-default\" (UID: \"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5\") " pod="openstack/mariadb-client-5-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.457663 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbgw\" (UniqueName: \"kubernetes.io/projected/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5-kube-api-access-lcbgw\") pod \"mariadb-client-5-default\" (UID: \"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5\") " pod="openstack/mariadb-client-5-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.484929 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbgw\" (UniqueName: \"kubernetes.io/projected/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5-kube-api-access-lcbgw\") pod \"mariadb-client-5-default\" (UID: \"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5\") " pod="openstack/mariadb-client-5-default" Oct 13 14:20:47 crc kubenswrapper[4797]: I1013 14:20:47.514584 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 13 14:20:48 crc kubenswrapper[4797]: I1013 14:20:48.020117 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 13 14:20:48 crc kubenswrapper[4797]: I1013 14:20:48.776825 4797 generic.go:334] "Generic (PLEG): container finished" podID="c8166a37-c0d2-4f04-90d9-807a4f3d7dd5" containerID="381fee8e8354196b3eeae958d1e5e043b47cee940c956023213668e8423081c5" exitCode=0 Oct 13 14:20:48 crc kubenswrapper[4797]: I1013 14:20:48.776902 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5","Type":"ContainerDied","Data":"381fee8e8354196b3eeae958d1e5e043b47cee940c956023213668e8423081c5"} Oct 13 14:20:48 crc kubenswrapper[4797]: I1013 14:20:48.777102 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5","Type":"ContainerStarted","Data":"b97b306232ed9ba86babd9fb0a7644841873c509465fc48b24c86fb784828c7a"} Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.271765 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.292832 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_c8166a37-c0d2-4f04-90d9-807a4f3d7dd5/mariadb-client-5-default/0.log" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.329226 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.343710 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.403592 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcbgw\" (UniqueName: \"kubernetes.io/projected/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5-kube-api-access-lcbgw\") pod \"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5\" (UID: \"c8166a37-c0d2-4f04-90d9-807a4f3d7dd5\") " Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.409118 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5-kube-api-access-lcbgw" (OuterVolumeSpecName: "kube-api-access-lcbgw") pod "c8166a37-c0d2-4f04-90d9-807a4f3d7dd5" (UID: "c8166a37-c0d2-4f04-90d9-807a4f3d7dd5"). InnerVolumeSpecName "kube-api-access-lcbgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.450000 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 13 14:20:50 crc kubenswrapper[4797]: E1013 14:20:50.450287 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8166a37-c0d2-4f04-90d9-807a4f3d7dd5" containerName="mariadb-client-5-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.450300 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8166a37-c0d2-4f04-90d9-807a4f3d7dd5" containerName="mariadb-client-5-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.450462 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8166a37-c0d2-4f04-90d9-807a4f3d7dd5" containerName="mariadb-client-5-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.450936 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.457726 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.505357 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcbgw\" (UniqueName: \"kubernetes.io/projected/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5-kube-api-access-lcbgw\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.606792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46hf\" (UniqueName: \"kubernetes.io/projected/d59ca801-da85-46fc-a78b-a08f3590a218-kube-api-access-t46hf\") pod \"mariadb-client-6-default\" (UID: \"d59ca801-da85-46fc-a78b-a08f3590a218\") " pod="openstack/mariadb-client-6-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.708290 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t46hf\" (UniqueName: \"kubernetes.io/projected/d59ca801-da85-46fc-a78b-a08f3590a218-kube-api-access-t46hf\") pod \"mariadb-client-6-default\" (UID: \"d59ca801-da85-46fc-a78b-a08f3590a218\") " pod="openstack/mariadb-client-6-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.724891 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46hf\" (UniqueName: \"kubernetes.io/projected/d59ca801-da85-46fc-a78b-a08f3590a218-kube-api-access-t46hf\") pod \"mariadb-client-6-default\" (UID: \"d59ca801-da85-46fc-a78b-a08f3590a218\") " pod="openstack/mariadb-client-6-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.785950 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.792561 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b97b306232ed9ba86babd9fb0a7644841873c509465fc48b24c86fb784828c7a" Oct 13 14:20:50 crc kubenswrapper[4797]: I1013 14:20:50.792623 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 13 14:20:51 crc kubenswrapper[4797]: I1013 14:20:51.248129 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8166a37-c0d2-4f04-90d9-807a4f3d7dd5" path="/var/lib/kubelet/pods/c8166a37-c0d2-4f04-90d9-807a4f3d7dd5/volumes" Oct 13 14:20:52 crc kubenswrapper[4797]: I1013 14:20:52.099465 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 13 14:20:52 crc kubenswrapper[4797]: I1013 14:20:52.810170 4797 generic.go:334] "Generic (PLEG): container finished" podID="d59ca801-da85-46fc-a78b-a08f3590a218" containerID="1ea42cc83adbb8ca30180c0024ecca3751f890ed77be1900674a3a4269e1af92" exitCode=0 Oct 13 14:20:52 crc kubenswrapper[4797]: I1013 14:20:52.810274 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d59ca801-da85-46fc-a78b-a08f3590a218","Type":"ContainerDied","Data":"1ea42cc83adbb8ca30180c0024ecca3751f890ed77be1900674a3a4269e1af92"} Oct 13 14:20:52 crc kubenswrapper[4797]: I1013 14:20:52.810455 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d59ca801-da85-46fc-a78b-a08f3590a218","Type":"ContainerStarted","Data":"1516dbbb16002a82a3454f371b043fc96e7a453fbc0e4b0194c45a38087cee18"} Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.295473 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.339448 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_d59ca801-da85-46fc-a78b-a08f3590a218/mariadb-client-6-default/0.log" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.362500 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.370330 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.469973 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t46hf\" (UniqueName: \"kubernetes.io/projected/d59ca801-da85-46fc-a78b-a08f3590a218-kube-api-access-t46hf\") pod \"d59ca801-da85-46fc-a78b-a08f3590a218\" (UID: \"d59ca801-da85-46fc-a78b-a08f3590a218\") " Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.478840 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59ca801-da85-46fc-a78b-a08f3590a218-kube-api-access-t46hf" (OuterVolumeSpecName: "kube-api-access-t46hf") pod "d59ca801-da85-46fc-a78b-a08f3590a218" (UID: "d59ca801-da85-46fc-a78b-a08f3590a218"). InnerVolumeSpecName "kube-api-access-t46hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.496347 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 13 14:20:54 crc kubenswrapper[4797]: E1013 14:20:54.498172 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59ca801-da85-46fc-a78b-a08f3590a218" containerName="mariadb-client-6-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.498206 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59ca801-da85-46fc-a78b-a08f3590a218" containerName="mariadb-client-6-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.498422 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59ca801-da85-46fc-a78b-a08f3590a218" containerName="mariadb-client-6-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.499077 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.511647 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.572645 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t46hf\" (UniqueName: \"kubernetes.io/projected/d59ca801-da85-46fc-a78b-a08f3590a218-kube-api-access-t46hf\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.674879 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v2kf\" (UniqueName: \"kubernetes.io/projected/48b7f3bb-e1fc-4047-95e1-676286e4b72c-kube-api-access-7v2kf\") pod \"mariadb-client-7-default\" (UID: \"48b7f3bb-e1fc-4047-95e1-676286e4b72c\") " pod="openstack/mariadb-client-7-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.775985 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7v2kf\" (UniqueName: \"kubernetes.io/projected/48b7f3bb-e1fc-4047-95e1-676286e4b72c-kube-api-access-7v2kf\") pod \"mariadb-client-7-default\" (UID: \"48b7f3bb-e1fc-4047-95e1-676286e4b72c\") " pod="openstack/mariadb-client-7-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.793794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v2kf\" (UniqueName: \"kubernetes.io/projected/48b7f3bb-e1fc-4047-95e1-676286e4b72c-kube-api-access-7v2kf\") pod \"mariadb-client-7-default\" (UID: \"48b7f3bb-e1fc-4047-95e1-676286e4b72c\") " pod="openstack/mariadb-client-7-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.842883 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1516dbbb16002a82a3454f371b043fc96e7a453fbc0e4b0194c45a38087cee18" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.842952 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 13 14:20:54 crc kubenswrapper[4797]: I1013 14:20:54.856750 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 13 14:20:55 crc kubenswrapper[4797]: I1013 14:20:55.205469 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 13 14:20:55 crc kubenswrapper[4797]: W1013 14:20:55.212022 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48b7f3bb_e1fc_4047_95e1_676286e4b72c.slice/crio-a972cdf2b402a50e4830516441da96c0cda518a24b0c06df9dfc2ffaf3afe1db WatchSource:0}: Error finding container a972cdf2b402a50e4830516441da96c0cda518a24b0c06df9dfc2ffaf3afe1db: Status 404 returned error can't find the container with id a972cdf2b402a50e4830516441da96c0cda518a24b0c06df9dfc2ffaf3afe1db Oct 13 14:20:55 crc kubenswrapper[4797]: I1013 14:20:55.246850 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59ca801-da85-46fc-a78b-a08f3590a218" path="/var/lib/kubelet/pods/d59ca801-da85-46fc-a78b-a08f3590a218/volumes" Oct 13 14:20:55 crc kubenswrapper[4797]: I1013 14:20:55.853828 4797 generic.go:334] "Generic (PLEG): container finished" podID="48b7f3bb-e1fc-4047-95e1-676286e4b72c" containerID="2984706f4a6aa410e8c3d22187daf4d9f45df63940c651963593443f3410f764" exitCode=0 Oct 13 14:20:55 crc kubenswrapper[4797]: I1013 14:20:55.853947 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"48b7f3bb-e1fc-4047-95e1-676286e4b72c","Type":"ContainerDied","Data":"2984706f4a6aa410e8c3d22187daf4d9f45df63940c651963593443f3410f764"} Oct 13 14:20:55 crc kubenswrapper[4797]: I1013 14:20:55.854324 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"48b7f3bb-e1fc-4047-95e1-676286e4b72c","Type":"ContainerStarted","Data":"a972cdf2b402a50e4830516441da96c0cda518a24b0c06df9dfc2ffaf3afe1db"} Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.236506 4797 scope.go:117] "RemoveContainer" 
containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:20:57 crc kubenswrapper[4797]: E1013 14:20:57.237229 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.264381 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.320190 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_48b7f3bb-e1fc-4047-95e1-676286e4b72c/mariadb-client-7-default/0.log" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.343202 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.348563 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.417021 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v2kf\" (UniqueName: \"kubernetes.io/projected/48b7f3bb-e1fc-4047-95e1-676286e4b72c-kube-api-access-7v2kf\") pod \"48b7f3bb-e1fc-4047-95e1-676286e4b72c\" (UID: \"48b7f3bb-e1fc-4047-95e1-676286e4b72c\") " Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.422958 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b7f3bb-e1fc-4047-95e1-676286e4b72c-kube-api-access-7v2kf" (OuterVolumeSpecName: "kube-api-access-7v2kf") pod "48b7f3bb-e1fc-4047-95e1-676286e4b72c" (UID: 
"48b7f3bb-e1fc-4047-95e1-676286e4b72c"). InnerVolumeSpecName "kube-api-access-7v2kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.461500 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 13 14:20:57 crc kubenswrapper[4797]: E1013 14:20:57.461853 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b7f3bb-e1fc-4047-95e1-676286e4b72c" containerName="mariadb-client-7-default" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.461872 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b7f3bb-e1fc-4047-95e1-676286e4b72c" containerName="mariadb-client-7-default" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.462019 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b7f3bb-e1fc-4047-95e1-676286e4b72c" containerName="mariadb-client-7-default" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.462500 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.470447 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.519183 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v2kf\" (UniqueName: \"kubernetes.io/projected/48b7f3bb-e1fc-4047-95e1-676286e4b72c-kube-api-access-7v2kf\") on node \"crc\" DevicePath \"\"" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.621294 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9p5w\" (UniqueName: \"kubernetes.io/projected/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c-kube-api-access-k9p5w\") pod \"mariadb-client-2\" (UID: \"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c\") " pod="openstack/mariadb-client-2" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.723485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9p5w\" (UniqueName: \"kubernetes.io/projected/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c-kube-api-access-k9p5w\") pod \"mariadb-client-2\" (UID: \"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c\") " pod="openstack/mariadb-client-2" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.739074 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9p5w\" (UniqueName: \"kubernetes.io/projected/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c-kube-api-access-k9p5w\") pod \"mariadb-client-2\" (UID: \"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c\") " pod="openstack/mariadb-client-2" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.780382 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.876108 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a972cdf2b402a50e4830516441da96c0cda518a24b0c06df9dfc2ffaf3afe1db" Oct 13 14:20:57 crc kubenswrapper[4797]: I1013 14:20:57.876206 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 13 14:20:58 crc kubenswrapper[4797]: I1013 14:20:58.265918 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 13 14:20:58 crc kubenswrapper[4797]: I1013 14:20:58.889770 4797 generic.go:334] "Generic (PLEG): container finished" podID="8d4a2b75-0677-4c1c-a7d0-a7d244411b3c" containerID="cf714e3407b0d29ee21f458d0091ddea60b12ebee9d446a2da9cc6392dd369c7" exitCode=0 Oct 13 14:20:58 crc kubenswrapper[4797]: I1013 14:20:58.889871 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c","Type":"ContainerDied","Data":"cf714e3407b0d29ee21f458d0091ddea60b12ebee9d446a2da9cc6392dd369c7"} Oct 13 14:20:58 crc kubenswrapper[4797]: I1013 14:20:58.890328 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c","Type":"ContainerStarted","Data":"f2a524d368c2fc6860acb1cb271a19d632de4896feb45353556837128bb6e621"} Oct 13 14:20:59 crc kubenswrapper[4797]: I1013 14:20:59.247617 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b7f3bb-e1fc-4047-95e1-676286e4b72c" path="/var/lib/kubelet/pods/48b7f3bb-e1fc-4047-95e1-676286e4b72c/volumes" Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.250281 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.274857 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_8d4a2b75-0677-4c1c-a7d0-a7d244411b3c/mariadb-client-2/0.log" Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.299194 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.303573 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.363030 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9p5w\" (UniqueName: \"kubernetes.io/projected/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c-kube-api-access-k9p5w\") pod \"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c\" (UID: \"8d4a2b75-0677-4c1c-a7d0-a7d244411b3c\") " Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.371589 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c-kube-api-access-k9p5w" (OuterVolumeSpecName: "kube-api-access-k9p5w") pod "8d4a2b75-0677-4c1c-a7d0-a7d244411b3c" (UID: "8d4a2b75-0677-4c1c-a7d0-a7d244411b3c"). InnerVolumeSpecName "kube-api-access-k9p5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.464407 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9p5w\" (UniqueName: \"kubernetes.io/projected/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c-kube-api-access-k9p5w\") on node \"crc\" DevicePath \"\"" Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.904946 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a524d368c2fc6860acb1cb271a19d632de4896feb45353556837128bb6e621" Oct 13 14:21:00 crc kubenswrapper[4797]: I1013 14:21:00.905004 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 13 14:21:01 crc kubenswrapper[4797]: I1013 14:21:01.247338 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4a2b75-0677-4c1c-a7d0-a7d244411b3c" path="/var/lib/kubelet/pods/8d4a2b75-0677-4c1c-a7d0-a7d244411b3c/volumes" Oct 13 14:21:11 crc kubenswrapper[4797]: I1013 14:21:11.236060 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:21:11 crc kubenswrapper[4797]: E1013 14:21:11.237043 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:21:11 crc kubenswrapper[4797]: I1013 14:21:11.749711 4797 scope.go:117] "RemoveContainer" containerID="ce2f34a16021f09ae0550752bd4da285b25538d5effcb1520c8d1caed405ed0c" Oct 13 14:21:24 crc kubenswrapper[4797]: I1013 14:21:24.236491 4797 scope.go:117] "RemoveContainer" 
containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:21:24 crc kubenswrapper[4797]: E1013 14:21:24.237323 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:21:36 crc kubenswrapper[4797]: I1013 14:21:36.236631 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:21:36 crc kubenswrapper[4797]: E1013 14:21:36.237383 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:21:48 crc kubenswrapper[4797]: I1013 14:21:48.237028 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:21:48 crc kubenswrapper[4797]: E1013 14:21:48.238023 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:21:59 crc kubenswrapper[4797]: I1013 14:21:59.236596 4797 scope.go:117] 
"RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:21:59 crc kubenswrapper[4797]: E1013 14:21:59.237385 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:22:12 crc kubenswrapper[4797]: I1013 14:22:12.236627 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:22:12 crc kubenswrapper[4797]: E1013 14:22:12.237302 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:22:26 crc kubenswrapper[4797]: I1013 14:22:26.236351 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:22:26 crc kubenswrapper[4797]: E1013 14:22:26.236898 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:22:40 crc kubenswrapper[4797]: I1013 14:22:40.236679 
4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:22:40 crc kubenswrapper[4797]: E1013 14:22:40.237466 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:22:52 crc kubenswrapper[4797]: I1013 14:22:52.235789 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:22:52 crc kubenswrapper[4797]: E1013 14:22:52.236603 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:23:07 crc kubenswrapper[4797]: I1013 14:23:07.236733 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:23:07 crc kubenswrapper[4797]: E1013 14:23:07.237662 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:23:19 crc kubenswrapper[4797]: I1013 
14:23:19.238207 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:23:19 crc kubenswrapper[4797]: E1013 14:23:19.239599 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:23:30 crc kubenswrapper[4797]: I1013 14:23:30.236891 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:23:30 crc kubenswrapper[4797]: E1013 14:23:30.238085 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.805877 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 13 14:23:38 crc kubenswrapper[4797]: E1013 14:23:38.808731 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4a2b75-0677-4c1c-a7d0-a7d244411b3c" containerName="mariadb-client-2" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.809018 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4a2b75-0677-4c1c-a7d0-a7d244411b3c" containerName="mariadb-client-2" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.809605 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8d4a2b75-0677-4c1c-a7d0-a7d244411b3c" containerName="mariadb-client-2" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.811080 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.819891 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.820459 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dj9th" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.971356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " pod="openstack/mariadb-copy-data" Oct 13 14:23:38 crc kubenswrapper[4797]: I1013 14:23:38.971744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9x5q\" (UniqueName: \"kubernetes.io/projected/e6f3fa0b-2447-4956-8550-71b9a486cb9b-kube-api-access-d9x5q\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.074100 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.074184 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9x5q\" (UniqueName: 
\"kubernetes.io/projected/e6f3fa0b-2447-4956-8550-71b9a486cb9b-kube-api-access-d9x5q\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.078378 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.078423 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89a137d9690c5059289176525afab29357f8d103eed508667f413b232f1cbc21/globalmount\"" pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.119941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9x5q\" (UniqueName: \"kubernetes.io/projected/e6f3fa0b-2447-4956-8550-71b9a486cb9b-kube-api-access-d9x5q\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.129884 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") pod \"mariadb-copy-data\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.147122 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 13 14:23:39 crc kubenswrapper[4797]: I1013 14:23:39.704935 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 13 14:23:40 crc kubenswrapper[4797]: I1013 14:23:40.221125 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6f3fa0b-2447-4956-8550-71b9a486cb9b","Type":"ContainerStarted","Data":"247f4f4d82f828cf74310f3496e96c5bbbbce04b5e20087d937886dd34059a39"} Oct 13 14:23:40 crc kubenswrapper[4797]: I1013 14:23:40.221358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6f3fa0b-2447-4956-8550-71b9a486cb9b","Type":"ContainerStarted","Data":"1557ff85d31aba04800006bfccc48051ef31398adbbf8c01cf5a99b246ad8426"} Oct 13 14:23:40 crc kubenswrapper[4797]: I1013 14:23:40.242606 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.242585042 podStartE2EDuration="3.242585042s" podCreationTimestamp="2025-10-13 14:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:23:40.238801469 +0000 UTC m=+4597.772351745" watchObservedRunningTime="2025-10-13 14:23:40.242585042 +0000 UTC m=+4597.776135318" Oct 13 14:23:41 crc kubenswrapper[4797]: I1013 14:23:41.997009 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:41 crc kubenswrapper[4797]: I1013 14:23:41.998183 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:42 crc kubenswrapper[4797]: I1013 14:23:42.012940 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:42 crc kubenswrapper[4797]: I1013 14:23:42.118503 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdz6p\" (UniqueName: \"kubernetes.io/projected/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf-kube-api-access-zdz6p\") pod \"mariadb-client\" (UID: \"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf\") " pod="openstack/mariadb-client" Oct 13 14:23:42 crc kubenswrapper[4797]: I1013 14:23:42.219480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdz6p\" (UniqueName: \"kubernetes.io/projected/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf-kube-api-access-zdz6p\") pod \"mariadb-client\" (UID: \"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf\") " pod="openstack/mariadb-client" Oct 13 14:23:42 crc kubenswrapper[4797]: I1013 14:23:42.241772 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdz6p\" (UniqueName: \"kubernetes.io/projected/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf-kube-api-access-zdz6p\") pod \"mariadb-client\" (UID: \"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf\") " pod="openstack/mariadb-client" Oct 13 14:23:42 crc kubenswrapper[4797]: I1013 14:23:42.333571 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:42 crc kubenswrapper[4797]: I1013 14:23:42.801547 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:43 crc kubenswrapper[4797]: I1013 14:23:43.247132 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:23:43 crc kubenswrapper[4797]: E1013 14:23:43.247553 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:23:43 crc kubenswrapper[4797]: I1013 14:23:43.257423 4797 generic.go:334] "Generic (PLEG): container finished" podID="1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf" containerID="7730fc0f4f897660463179709de9120853076c8748e3b3b0787f369245701d23" exitCode=0 Oct 13 14:23:43 crc kubenswrapper[4797]: I1013 14:23:43.257458 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf","Type":"ContainerDied","Data":"7730fc0f4f897660463179709de9120853076c8748e3b3b0787f369245701d23"} Oct 13 14:23:43 crc kubenswrapper[4797]: I1013 14:23:43.257477 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf","Type":"ContainerStarted","Data":"57c973d9900e080af169d005eb0e2527c46577b3f945e4e737809d7709e36cda"} Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.668187 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.688895 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf/mariadb-client/0.log" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.714988 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.721423 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.760320 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdz6p\" (UniqueName: \"kubernetes.io/projected/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf-kube-api-access-zdz6p\") pod \"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf\" (UID: \"1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf\") " Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.765123 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf-kube-api-access-zdz6p" (OuterVolumeSpecName: "kube-api-access-zdz6p") pod "1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf" (UID: "1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf"). InnerVolumeSpecName "kube-api-access-zdz6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.848945 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:44 crc kubenswrapper[4797]: E1013 14:23:44.849415 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf" containerName="mariadb-client" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.849441 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf" containerName="mariadb-client" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.854515 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf" containerName="mariadb-client" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.855561 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.865282 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.867066 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdz6p\" (UniqueName: \"kubernetes.io/projected/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf-kube-api-access-zdz6p\") on node \"crc\" DevicePath \"\"" Oct 13 14:23:44 crc kubenswrapper[4797]: I1013 14:23:44.969238 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lfv\" (UniqueName: \"kubernetes.io/projected/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7-kube-api-access-w8lfv\") pod \"mariadb-client\" (UID: \"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7\") " pod="openstack/mariadb-client" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.070682 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lfv\" (UniqueName: 
\"kubernetes.io/projected/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7-kube-api-access-w8lfv\") pod \"mariadb-client\" (UID: \"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7\") " pod="openstack/mariadb-client" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.092277 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lfv\" (UniqueName: \"kubernetes.io/projected/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7-kube-api-access-w8lfv\") pod \"mariadb-client\" (UID: \"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7\") " pod="openstack/mariadb-client" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.190442 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.249514 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf" path="/var/lib/kubelet/pods/1a6cd4f8-4903-4f0d-a232-f131cfc5f8bf/volumes" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.277534 4797 scope.go:117] "RemoveContainer" containerID="7730fc0f4f897660463179709de9120853076c8748e3b3b0787f369245701d23" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.277660 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:45 crc kubenswrapper[4797]: I1013 14:23:45.707092 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:45 crc kubenswrapper[4797]: W1013 14:23:45.710172 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a154493_c6b0_4f2b_9d1e_dbcfa50f28c7.slice/crio-ce5cc1e8b99ed6dac6730e46d9717be2723a975a0b9252241196307fde9573fe WatchSource:0}: Error finding container ce5cc1e8b99ed6dac6730e46d9717be2723a975a0b9252241196307fde9573fe: Status 404 returned error can't find the container with id ce5cc1e8b99ed6dac6730e46d9717be2723a975a0b9252241196307fde9573fe Oct 13 14:23:46 crc kubenswrapper[4797]: I1013 14:23:46.293198 4797 generic.go:334] "Generic (PLEG): container finished" podID="8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7" containerID="1316595c470754d486e8891348ee8a2d3a8ab9391726d1ef6892c1e152c2834a" exitCode=0 Oct 13 14:23:46 crc kubenswrapper[4797]: I1013 14:23:46.293721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7","Type":"ContainerDied","Data":"1316595c470754d486e8891348ee8a2d3a8ab9391726d1ef6892c1e152c2834a"} Oct 13 14:23:46 crc kubenswrapper[4797]: I1013 14:23:46.294554 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7","Type":"ContainerStarted","Data":"ce5cc1e8b99ed6dac6730e46d9717be2723a975a0b9252241196307fde9573fe"} Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.665049 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.685114 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7/mariadb-client/0.log" Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.707289 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.712301 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.812344 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lfv\" (UniqueName: \"kubernetes.io/projected/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7-kube-api-access-w8lfv\") pod \"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7\" (UID: \"8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7\") " Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.819630 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7-kube-api-access-w8lfv" (OuterVolumeSpecName: "kube-api-access-w8lfv") pod "8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7" (UID: "8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7"). InnerVolumeSpecName "kube-api-access-w8lfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:23:47 crc kubenswrapper[4797]: I1013 14:23:47.913898 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lfv\" (UniqueName: \"kubernetes.io/projected/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7-kube-api-access-w8lfv\") on node \"crc\" DevicePath \"\"" Oct 13 14:23:48 crc kubenswrapper[4797]: I1013 14:23:48.325294 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce5cc1e8b99ed6dac6730e46d9717be2723a975a0b9252241196307fde9573fe" Oct 13 14:23:48 crc kubenswrapper[4797]: I1013 14:23:48.325447 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 13 14:23:49 crc kubenswrapper[4797]: I1013 14:23:49.249909 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7" path="/var/lib/kubelet/pods/8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7/volumes" Oct 13 14:23:54 crc kubenswrapper[4797]: I1013 14:23:54.240577 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:23:55 crc kubenswrapper[4797]: I1013 14:23:55.393772 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"bd2a230444894bdbf4c868b39927df09a997acfb15fbcd184a28d0e618ab816a"} Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.487186 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 14:24:21 crc kubenswrapper[4797]: E1013 14:24:21.488360 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7" containerName="mariadb-client" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.488379 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7" containerName="mariadb-client" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.488554 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a154493-c6b0-4f2b-9d1e-dbcfa50f28c7" containerName="mariadb-client" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.489489 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.497084 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.497411 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.497839 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dktt7" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.509062 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.517797 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.520094 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.530476 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.535472 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.549160 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.562616 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.601667 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4d36fc-f414-419a-88fb-8897d4029861-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.601759 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4d36fc-f414-419a-88fb-8897d4029861-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.601992 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f4d36fc-f414-419a-88fb-8897d4029861-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.602029 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.602052 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhr9\" (UniqueName: \"kubernetes.io/projected/2f4d36fc-f414-419a-88fb-8897d4029861-kube-api-access-dlhr9\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.602242 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f4d36fc-f414-419a-88fb-8897d4029861-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.681849 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.683509 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.692476 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-69cbg" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.692504 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.693389 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.696248 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.698139 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.701854 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.703148 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704144 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f792911e-0022-42df-9a19-9912fde49848-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704252 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f4d36fc-f414-419a-88fb-8897d4029861-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704327 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704377 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhr9\" (UniqueName: \"kubernetes.io/projected/2f4d36fc-f414-419a-88fb-8897d4029861-kube-api-access-dlhr9\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704439 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96mf\" (UniqueName: \"kubernetes.io/projected/f792911e-0022-42df-9a19-9912fde49848-kube-api-access-j96mf\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704504 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704554 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f792911e-0022-42df-9a19-9912fde49848-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704598 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f792911e-0022-42df-9a19-9912fde49848-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704643 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbefe95-eada-4df1-90ce-a7350636f4fb-config\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704707 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62d871ac-cf19-46f5-92f7-621835970b2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62d871ac-cf19-46f5-92f7-621835970b2a\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704762 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbefe95-eada-4df1-90ce-a7350636f4fb-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.704871 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f4d36fc-f414-419a-88fb-8897d4029861-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.705504 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4d36fc-f414-419a-88fb-8897d4029861-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.705768 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f4d36fc-f414-419a-88fb-8897d4029861-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.706005 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/2f4d36fc-f414-419a-88fb-8897d4029861-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.706148 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmlf\" (UniqueName: \"kubernetes.io/projected/1cbefe95-eada-4df1-90ce-a7350636f4fb-kube-api-access-gkmlf\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.706200 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cbefe95-eada-4df1-90ce-a7350636f4fb-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.706232 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbefe95-eada-4df1-90ce-a7350636f4fb-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.706262 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4d36fc-f414-419a-88fb-8897d4029861-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.706323 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f792911e-0022-42df-9a19-9912fde49848-config\") pod \"ovsdbserver-nb-1\" (UID: 
\"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.707194 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4d36fc-f414-419a-88fb-8897d4029861-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.707849 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.709169 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.709201 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7428c95267ee6c705d9fe5bde4e0898af6c866870fa9b659cb22e64f4c261232/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.715721 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.722212 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4d36fc-f414-419a-88fb-8897d4029861-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.726481 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-1"] Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.742695 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhr9\" (UniqueName: \"kubernetes.io/projected/2f4d36fc-f414-419a-88fb-8897d4029861-kube-api-access-dlhr9\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.745111 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66be4aab-0cd6-426a-b2e4-86ae3c4024e3\") pod \"ovsdbserver-nb-0\" (UID: \"2f4d36fc-f414-419a-88fb-8897d4029861\") " pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.807467 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgg2c\" (UniqueName: \"kubernetes.io/projected/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-kube-api-access-mgg2c\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808156 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j96mf\" (UniqueName: \"kubernetes.io/projected/f792911e-0022-42df-9a19-9912fde49848-kube-api-access-j96mf\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2ab030ed-5bf4-480d-bdef-c2e145080cfd-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc 
kubenswrapper[4797]: I1013 14:24:21.808233 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/2ab030ed-5bf4-480d-bdef-c2e145080cfd-kube-api-access-xxzwg\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808277 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808309 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ab030ed-5bf4-480d-bdef-c2e145080cfd-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808344 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f792911e-0022-42df-9a19-9912fde49848-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f792911e-0022-42df-9a19-9912fde49848-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808403 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbefe95-eada-4df1-90ce-a7350636f4fb-config\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808437 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab030ed-5bf4-480d-bdef-c2e145080cfd-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62d871ac-cf19-46f5-92f7-621835970b2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62d871ac-cf19-46f5-92f7-621835970b2a\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbefe95-eada-4df1-90ce-a7350636f4fb-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808563 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808603 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-config\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808660 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmlf\" (UniqueName: \"kubernetes.io/projected/1cbefe95-eada-4df1-90ce-a7350636f4fb-kube-api-access-gkmlf\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808699 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb1869c6-0b2c-4fad-9681-94b1b427085a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808730 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cbefe95-eada-4df1-90ce-a7350636f4fb-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808762 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbefe95-eada-4df1-90ce-a7350636f4fb-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808856 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nr5m\" (UniqueName: \"kubernetes.io/projected/fb1869c6-0b2c-4fad-9681-94b1b427085a-kube-api-access-2nr5m\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808885 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808922 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f792911e-0022-42df-9a19-9912fde49848-config\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1869c6-0b2c-4fad-9681-94b1b427085a-config\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.808998 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 
14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.809034 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.809063 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab030ed-5bf4-480d-bdef-c2e145080cfd-config\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.809100 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.809141 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f792911e-0022-42df-9a19-9912fde49848-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.809183 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1869c6-0b2c-4fad-9681-94b1b427085a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.809219 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb1869c6-0b2c-4fad-9681-94b1b427085a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.811427 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f792911e-0022-42df-9a19-9912fde49848-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.813095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f792911e-0022-42df-9a19-9912fde49848-config\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.813472 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f792911e-0022-42df-9a19-9912fde49848-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.813495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cbefe95-eada-4df1-90ce-a7350636f4fb-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.815103 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbefe95-eada-4df1-90ce-a7350636f4fb-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") 
" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.815201 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.815244 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62d871ac-cf19-46f5-92f7-621835970b2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62d871ac-cf19-46f5-92f7-621835970b2a\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/357b23a5c8637b200eb690383076b8fd8dc0ecac6c3ed89a009e91f2e88484bb/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.815520 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.815567 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/325ae6e91df6b5c61ba56fc0e974e77aaef17bb5b370599b846153b900ab2470/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.817870 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f792911e-0022-42df-9a19-9912fde49848-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.819132 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbefe95-eada-4df1-90ce-a7350636f4fb-config\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.824086 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbefe95-eada-4df1-90ce-a7350636f4fb-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.824754 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96mf\" (UniqueName: \"kubernetes.io/projected/f792911e-0022-42df-9a19-9912fde49848-kube-api-access-j96mf\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " 
pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.833172 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkmlf\" (UniqueName: \"kubernetes.io/projected/1cbefe95-eada-4df1-90ce-a7350636f4fb-kube-api-access-gkmlf\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.842932 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.844391 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62d871ac-cf19-46f5-92f7-621835970b2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62d871ac-cf19-46f5-92f7-621835970b2a\") pod \"ovsdbserver-nb-2\" (UID: \"1cbefe95-eada-4df1-90ce-a7350636f4fb\") " pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.846297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b77df48f-6bb0-4d91-ac48-4a708dcfe88f\") pod \"ovsdbserver-nb-1\" (UID: \"f792911e-0022-42df-9a19-9912fde49848\") " pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.876200 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.882938 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.910851 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.910932 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-config\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911000 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb1869c6-0b2c-4fad-9681-94b1b427085a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911023 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911044 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nr5m\" (UniqueName: \"kubernetes.io/projected/fb1869c6-0b2c-4fad-9681-94b1b427085a-kube-api-access-2nr5m\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911079 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911102 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1869c6-0b2c-4fad-9681-94b1b427085a-config\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911155 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911182 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911223 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab030ed-5bf4-480d-bdef-c2e145080cfd-config\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911244 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911273 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1869c6-0b2c-4fad-9681-94b1b427085a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911308 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb1869c6-0b2c-4fad-9681-94b1b427085a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911330 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgg2c\" (UniqueName: \"kubernetes.io/projected/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-kube-api-access-mgg2c\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911348 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2ab030ed-5bf4-480d-bdef-c2e145080cfd-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911377 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/2ab030ed-5bf4-480d-bdef-c2e145080cfd-kube-api-access-xxzwg\") pod \"ovsdbserver-sb-2\" (UID: 
\"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911395 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ab030ed-5bf4-480d-bdef-c2e145080cfd-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.911416 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab030ed-5bf4-480d-bdef-c2e145080cfd-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.912616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.912781 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab030ed-5bf4-480d-bdef-c2e145080cfd-config\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.913354 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2ab030ed-5bf4-480d-bdef-c2e145080cfd-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.913519 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-config\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.913917 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb1869c6-0b2c-4fad-9681-94b1b427085a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.913961 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb1869c6-0b2c-4fad-9681-94b1b427085a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.914103 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.914608 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.914635 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33fdbce1d54f1f9eac3435c3c80488a8611af1ac69a0d4abbcf3bc97ecb333b7/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.914643 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ab030ed-5bf4-480d-bdef-c2e145080cfd-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.914725 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.914749 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e4957acf57ff9da7808d16ac9ffe247282e39a0fc90dffe13c9f7e4f385525cc/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.916526 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab030ed-5bf4-480d-bdef-c2e145080cfd-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.917782 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1869c6-0b2c-4fad-9681-94b1b427085a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.920533 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1869c6-0b2c-4fad-9681-94b1b427085a-config\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.929001 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.929073 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8df20a2c64e343062f5d277b747915960e85af2d7011fb1fa3125998dae22237/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.929270 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.931952 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nr5m\" (UniqueName: \"kubernetes.io/projected/fb1869c6-0b2c-4fad-9681-94b1b427085a-kube-api-access-2nr5m\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.933866 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgg2c\" (UniqueName: \"kubernetes.io/projected/31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01-kube-api-access-mgg2c\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.937663 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/2ab030ed-5bf4-480d-bdef-c2e145080cfd-kube-api-access-xxzwg\") pod \"ovsdbserver-sb-2\" (UID: 
\"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.981450 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdfe40e-ff93-4dfe-801f-dab83d8000d4\") pod \"ovsdbserver-sb-1\" (UID: \"fb1869c6-0b2c-4fad-9681-94b1b427085a\") " pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:21 crc kubenswrapper[4797]: I1013 14:24:21.989495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-426a03e9-546a-47d6-997a-0bf61c2ef1c5\") pod \"ovsdbserver-sb-0\" (UID: \"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01\") " pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.016211 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.017459 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddf8cbda-eb0a-4a26-be69-9b1d013c059d\") pod \"ovsdbserver-sb-2\" (UID: \"2ab030ed-5bf4-480d-bdef-c2e145080cfd\") " pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.085036 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.091341 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.367258 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.471755 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 13 14:24:22 crc kubenswrapper[4797]: W1013 14:24:22.477377 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cbefe95_eada_4df1_90ce_a7350636f4fb.slice/crio-bad0dc1449bae61b294461f8b8284f8f871c00ef0b097a9a69903f3826ccd728 WatchSource:0}: Error finding container bad0dc1449bae61b294461f8b8284f8f871c00ef0b097a9a69903f3826ccd728: Status 404 returned error can't find the container with id bad0dc1449bae61b294461f8b8284f8f871c00ef0b097a9a69903f3826ccd728 Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.563267 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.641490 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f4d36fc-f414-419a-88fb-8897d4029861","Type":"ContainerStarted","Data":"5c1545acd5f4ab0000490c0fe71c1878cdf32af97400d1b635abce1e7fcc7956"} Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.642515 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"1cbefe95-eada-4df1-90ce-a7350636f4fb","Type":"ContainerStarted","Data":"bad0dc1449bae61b294461f8b8284f8f871c00ef0b097a9a69903f3826ccd728"} Oct 13 14:24:22 crc kubenswrapper[4797]: I1013 14:24:22.643878 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2ab030ed-5bf4-480d-bdef-c2e145080cfd","Type":"ContainerStarted","Data":"baf66afc7fffdf460cdc744e850e9941dd39fd950b1ed0e258db0814f61f4e98"} Oct 13 14:24:22 crc 
kubenswrapper[4797]: I1013 14:24:22.720032 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 13 14:24:22 crc kubenswrapper[4797]: W1013 14:24:22.729001 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb1869c6_0b2c_4fad_9681_94b1b427085a.slice/crio-3708c0477c54eaac9d6a987e99136453f571a9d3c2c9699de47ab0ad7eb7359f WatchSource:0}: Error finding container 3708c0477c54eaac9d6a987e99136453f571a9d3c2c9699de47ab0ad7eb7359f: Status 404 returned error can't find the container with id 3708c0477c54eaac9d6a987e99136453f571a9d3c2c9699de47ab0ad7eb7359f Oct 13 14:24:23 crc kubenswrapper[4797]: I1013 14:24:23.114183 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 13 14:24:23 crc kubenswrapper[4797]: I1013 14:24:23.207626 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 14:24:23 crc kubenswrapper[4797]: W1013 14:24:23.216860 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31b1a65c_acf8_4ab5_8e13_fbfa0bd51d01.slice/crio-a251b306dfb0947a025d0c0de60286d8ae6c9c990781c43e09d122ed3d78a3dd WatchSource:0}: Error finding container a251b306dfb0947a025d0c0de60286d8ae6c9c990781c43e09d122ed3d78a3dd: Status 404 returned error can't find the container with id a251b306dfb0947a025d0c0de60286d8ae6c9c990781c43e09d122ed3d78a3dd Oct 13 14:24:23 crc kubenswrapper[4797]: I1013 14:24:23.652596 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01","Type":"ContainerStarted","Data":"a251b306dfb0947a025d0c0de60286d8ae6c9c990781c43e09d122ed3d78a3dd"} Oct 13 14:24:23 crc kubenswrapper[4797]: I1013 14:24:23.653638 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"f792911e-0022-42df-9a19-9912fde49848","Type":"ContainerStarted","Data":"2afb12132c1b9eb4423f18d4060d770056de5ded02c7d146b82e583a11e64c21"} Oct 13 14:24:23 crc kubenswrapper[4797]: I1013 14:24:23.655472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fb1869c6-0b2c-4fad-9681-94b1b427085a","Type":"ContainerStarted","Data":"3708c0477c54eaac9d6a987e99136453f571a9d3c2c9699de47ab0ad7eb7359f"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.713434 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"1cbefe95-eada-4df1-90ce-a7350636f4fb","Type":"ContainerStarted","Data":"a97009c76b00693369522ba454d834e5b90f63cdf2c8c0655b0d9c24f67f8c96"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.714941 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"1cbefe95-eada-4df1-90ce-a7350636f4fb","Type":"ContainerStarted","Data":"8d0a0f31240105a1f06559b3bc1c47a9d757ac7b45a5e2952a44606b7aba6ec5"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.715950 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2ab030ed-5bf4-480d-bdef-c2e145080cfd","Type":"ContainerStarted","Data":"0c58b26d5903345002c42a3d6ee2615561ea11038981b570aafe48647dc32967"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.715988 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2ab030ed-5bf4-480d-bdef-c2e145080cfd","Type":"ContainerStarted","Data":"11654343c92367732864806b6cbdb6b5d03c0f175fcff7e4cbb48a1681a478c4"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.718985 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01","Type":"ContainerStarted","Data":"71b3d69d3b7d7f2d1c3c957012af57d105af3c5a343e3d1e0af7714c7417c188"} Oct 13 14:24:29 crc 
kubenswrapper[4797]: I1013 14:24:29.719188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01","Type":"ContainerStarted","Data":"851137b2b56a2fe09a11d186580fd666d76dc97a3fc46da0a78e6a47ce985234"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.722056 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f792911e-0022-42df-9a19-9912fde49848","Type":"ContainerStarted","Data":"58fa08b0b91424a00ec5b1e8f6fcba61aaac30319c14a7d9729eb5feceb084ad"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.722087 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f792911e-0022-42df-9a19-9912fde49848","Type":"ContainerStarted","Data":"05eca23b4db09202289b105792a896db0663659463e6f260baa70895bd0a15bb"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.725086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fb1869c6-0b2c-4fad-9681-94b1b427085a","Type":"ContainerStarted","Data":"c9e41574d03d95c0037f439958615d2d0f463fa359fdaa9646f920fdf04b9968"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.725163 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fb1869c6-0b2c-4fad-9681-94b1b427085a","Type":"ContainerStarted","Data":"10a2cb894ca2f96d8b02019ff4710d5a1ecac540ee34c6cbf8f24f3d3bc78692"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.727600 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f4d36fc-f414-419a-88fb-8897d4029861","Type":"ContainerStarted","Data":"3a06883e6a43fc3f0900e81dfc357c2d864e2e4ef77ab000c2dc1ef5efd6effc"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.727633 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"2f4d36fc-f414-419a-88fb-8897d4029861","Type":"ContainerStarted","Data":"c5b2f1ec84936e8aae3bb76235bff7640b45e3f845b71b6e23b493c7a95bb25e"} Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.749073 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.841860711 podStartE2EDuration="9.749036508s" podCreationTimestamp="2025-10-13 14:24:20 +0000 UTC" firstStartedPulling="2025-10-13 14:24:22.479468538 +0000 UTC m=+4640.013018784" lastFinishedPulling="2025-10-13 14:24:28.386644315 +0000 UTC m=+4645.920194581" observedRunningTime="2025-10-13 14:24:29.741429742 +0000 UTC m=+4647.274980038" watchObservedRunningTime="2025-10-13 14:24:29.749036508 +0000 UTC m=+4647.282586804" Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.763416 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.748424222 podStartE2EDuration="9.763383489s" podCreationTimestamp="2025-10-13 14:24:20 +0000 UTC" firstStartedPulling="2025-10-13 14:24:22.372746764 +0000 UTC m=+4639.906297020" lastFinishedPulling="2025-10-13 14:24:28.387706021 +0000 UTC m=+4645.921256287" observedRunningTime="2025-10-13 14:24:29.762124609 +0000 UTC m=+4647.295674905" watchObservedRunningTime="2025-10-13 14:24:29.763383489 +0000 UTC m=+4647.296933775" Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.820050 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.002751872 podStartE2EDuration="9.820021857s" podCreationTimestamp="2025-10-13 14:24:20 +0000 UTC" firstStartedPulling="2025-10-13 14:24:22.569252637 +0000 UTC m=+4640.102802893" lastFinishedPulling="2025-10-13 14:24:28.386522602 +0000 UTC m=+4645.920072878" observedRunningTime="2025-10-13 14:24:29.785829429 +0000 UTC m=+4647.319379725" watchObservedRunningTime="2025-10-13 14:24:29.820021857 +0000 UTC m=+4647.353572123" Oct 
13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.830359 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.537613844 podStartE2EDuration="9.8303411s" podCreationTimestamp="2025-10-13 14:24:20 +0000 UTC" firstStartedPulling="2025-10-13 14:24:23.117683241 +0000 UTC m=+4640.651233497" lastFinishedPulling="2025-10-13 14:24:28.410410507 +0000 UTC m=+4645.943960753" observedRunningTime="2025-10-13 14:24:29.818839038 +0000 UTC m=+4647.352389334" watchObservedRunningTime="2025-10-13 14:24:29.8303411 +0000 UTC m=+4647.363891366" Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.844063 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.678405731 podStartE2EDuration="9.844040095s" podCreationTimestamp="2025-10-13 14:24:20 +0000 UTC" firstStartedPulling="2025-10-13 14:24:23.220886348 +0000 UTC m=+4640.754436624" lastFinishedPulling="2025-10-13 14:24:28.386520692 +0000 UTC m=+4645.920070988" observedRunningTime="2025-10-13 14:24:29.841726849 +0000 UTC m=+4647.375277115" watchObservedRunningTime="2025-10-13 14:24:29.844040095 +0000 UTC m=+4647.377590361" Oct 13 14:24:29 crc kubenswrapper[4797]: I1013 14:24:29.862885 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.20516579 podStartE2EDuration="9.862866226s" podCreationTimestamp="2025-10-13 14:24:20 +0000 UTC" firstStartedPulling="2025-10-13 14:24:22.731716697 +0000 UTC m=+4640.265266953" lastFinishedPulling="2025-10-13 14:24:28.389417133 +0000 UTC m=+4645.922967389" observedRunningTime="2025-10-13 14:24:29.860100949 +0000 UTC m=+4647.393651205" watchObservedRunningTime="2025-10-13 14:24:29.862866226 +0000 UTC m=+4647.396416482" Oct 13 14:24:30 crc kubenswrapper[4797]: I1013 14:24:30.843854 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:30 crc kubenswrapper[4797]: I1013 14:24:30.877211 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:30 crc kubenswrapper[4797]: I1013 14:24:30.884412 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.017488 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.085372 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.092569 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.092784 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.147786 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.160334 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.749321 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.749698 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.750091 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.843522 4797 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.876896 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:31 crc kubenswrapper[4797]: I1013 14:24:31.884042 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.820841 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.832853 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.912557 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.924883 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.940603 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.993205 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 13 14:24:33 crc kubenswrapper[4797]: I1013 14:24:33.993269 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.043096 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.071137 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d85f8967f-gphrv"] Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.072759 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.087470 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.088178 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d85f8967f-gphrv"] Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.173201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-ovsdbserver-sb\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.173308 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-dns-svc\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.173351 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rgnf\" (UniqueName: \"kubernetes.io/projected/6df2e2b9-9793-4361-9e81-dbc40bbb9197-kube-api-access-9rgnf\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.173399 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-config\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: 
\"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.254058 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d85f8967f-gphrv"] Oct 13 14:24:34 crc kubenswrapper[4797]: E1013 14:24:34.254703 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-9rgnf ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" podUID="6df2e2b9-9793-4361-9e81-dbc40bbb9197" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.275460 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-dns-svc\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.275566 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rgnf\" (UniqueName: \"kubernetes.io/projected/6df2e2b9-9793-4361-9e81-dbc40bbb9197-kube-api-access-9rgnf\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.275623 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-config\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.276389 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-dns-svc\") 
pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.276589 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d78d64fc-4gns6"] Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.276628 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-config\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.276769 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-ovsdbserver-sb\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.277532 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-ovsdbserver-sb\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.278018 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.279300 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.291255 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d78d64fc-4gns6"] Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.310114 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rgnf\" (UniqueName: \"kubernetes.io/projected/6df2e2b9-9793-4361-9e81-dbc40bbb9197-kube-api-access-9rgnf\") pod \"dnsmasq-dns-7d85f8967f-gphrv\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.379041 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-nb\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.379106 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-dns-svc\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.379285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-sb\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 
14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.379341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-config\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.379971 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zflp8\" (UniqueName: \"kubernetes.io/projected/a7af7ea5-7441-487b-b6be-1ffad6754828-kube-api-access-zflp8\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.481216 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-dns-svc\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.481292 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-sb\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.481318 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-config\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.481366 
4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zflp8\" (UniqueName: \"kubernetes.io/projected/a7af7ea5-7441-487b-b6be-1ffad6754828-kube-api-access-zflp8\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.481423 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-nb\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.482776 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-dns-svc\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.482879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-config\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.482911 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-sb\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.483009 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-nb\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.500379 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zflp8\" (UniqueName: \"kubernetes.io/projected/a7af7ea5-7441-487b-b6be-1ffad6754828-kube-api-access-zflp8\") pod \"dnsmasq-dns-9d78d64fc-4gns6\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.592172 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.776285 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.789344 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887114 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-config\") pod \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887165 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rgnf\" (UniqueName: \"kubernetes.io/projected/6df2e2b9-9793-4361-9e81-dbc40bbb9197-kube-api-access-9rgnf\") pod \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887230 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-dns-svc\") pod \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887269 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-ovsdbserver-sb\") pod \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\" (UID: \"6df2e2b9-9793-4361-9e81-dbc40bbb9197\") " Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887537 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-config" (OuterVolumeSpecName: "config") pod "6df2e2b9-9793-4361-9e81-dbc40bbb9197" (UID: "6df2e2b9-9793-4361-9e81-dbc40bbb9197"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887827 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6df2e2b9-9793-4361-9e81-dbc40bbb9197" (UID: "6df2e2b9-9793-4361-9e81-dbc40bbb9197"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.887851 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6df2e2b9-9793-4361-9e81-dbc40bbb9197" (UID: "6df2e2b9-9793-4361-9e81-dbc40bbb9197"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.888939 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.888959 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.888974 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df2e2b9-9793-4361-9e81-dbc40bbb9197-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.892783 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df2e2b9-9793-4361-9e81-dbc40bbb9197-kube-api-access-9rgnf" (OuterVolumeSpecName: "kube-api-access-9rgnf") pod 
"6df2e2b9-9793-4361-9e81-dbc40bbb9197" (UID: "6df2e2b9-9793-4361-9e81-dbc40bbb9197"). InnerVolumeSpecName "kube-api-access-9rgnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:24:34 crc kubenswrapper[4797]: I1013 14:24:34.990754 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rgnf\" (UniqueName: \"kubernetes.io/projected/6df2e2b9-9793-4361-9e81-dbc40bbb9197-kube-api-access-9rgnf\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.087215 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d78d64fc-4gns6"] Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.789706 4797 generic.go:334] "Generic (PLEG): container finished" podID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerID="6bb3c2d7af9c8506294e01054548a8e932fc6a3ecba112f6103c2d4880cff037" exitCode=0 Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.789992 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d85f8967f-gphrv" Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.789757 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" event={"ID":"a7af7ea5-7441-487b-b6be-1ffad6754828","Type":"ContainerDied","Data":"6bb3c2d7af9c8506294e01054548a8e932fc6a3ecba112f6103c2d4880cff037"} Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.790066 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" event={"ID":"a7af7ea5-7441-487b-b6be-1ffad6754828","Type":"ContainerStarted","Data":"3646c14d9463146325cb1a874467cc4b2d42ba9c843366f855ee88e42f247e21"} Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.885520 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d85f8967f-gphrv"] Oct 13 14:24:35 crc kubenswrapper[4797]: I1013 14:24:35.897510 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d85f8967f-gphrv"] Oct 13 14:24:36 crc kubenswrapper[4797]: I1013 14:24:36.813954 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" event={"ID":"a7af7ea5-7441-487b-b6be-1ffad6754828","Type":"ContainerStarted","Data":"2d42c462c6952a8d70787a870440947cfd2d204da07ec2ce15cfc0715532fc9f"} Oct 13 14:24:36 crc kubenswrapper[4797]: I1013 14:24:36.814784 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:36 crc kubenswrapper[4797]: I1013 14:24:36.882104 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" podStartSLOduration=2.882080415 podStartE2EDuration="2.882080415s" podCreationTimestamp="2025-10-13 14:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:24:36.860850005 +0000 UTC m=+4654.394400291" 
watchObservedRunningTime="2025-10-13 14:24:36.882080415 +0000 UTC m=+4654.415630681" Oct 13 14:24:37 crc kubenswrapper[4797]: I1013 14:24:37.249896 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df2e2b9-9793-4361-9e81-dbc40bbb9197" path="/var/lib/kubelet/pods/6df2e2b9-9793-4361-9e81-dbc40bbb9197/volumes" Oct 13 14:24:37 crc kubenswrapper[4797]: I1013 14:24:37.322206 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.004715 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.007097 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.020658 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.023070 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.104719 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.104795 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/575e9052-19a2-442d-af24-3a8bd5a2eb64-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.104934 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6j2j\" (UniqueName: \"kubernetes.io/projected/575e9052-19a2-442d-af24-3a8bd5a2eb64-kube-api-access-f6j2j\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.206946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6j2j\" (UniqueName: \"kubernetes.io/projected/575e9052-19a2-442d-af24-3a8bd5a2eb64-kube-api-access-f6j2j\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.207010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.207055 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/575e9052-19a2-442d-af24-3a8bd5a2eb64-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.211568 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.211628 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be52986b40281a990711cd060c9318df4a2e00c6c49b2d0b09dfe4fbe8734079/globalmount\"" pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.215281 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/575e9052-19a2-442d-af24-3a8bd5a2eb64-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.227306 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6j2j\" (UniqueName: \"kubernetes.io/projected/575e9052-19a2-442d-af24-3a8bd5a2eb64-kube-api-access-f6j2j\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.264915 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") pod \"ovn-copy-data\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.381231 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 13 14:24:40 crc kubenswrapper[4797]: I1013 14:24:40.965215 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 13 14:24:41 crc kubenswrapper[4797]: I1013 14:24:41.863917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"575e9052-19a2-442d-af24-3a8bd5a2eb64","Type":"ContainerStarted","Data":"2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff"} Oct 13 14:24:41 crc kubenswrapper[4797]: I1013 14:24:41.864996 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"575e9052-19a2-442d-af24-3a8bd5a2eb64","Type":"ContainerStarted","Data":"0647dda7607f8bdcd029415cfd253383347c5da774d58a46784333be814f72f3"} Oct 13 14:24:41 crc kubenswrapper[4797]: I1013 14:24:41.886454 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.696550038 podStartE2EDuration="3.886432788s" podCreationTimestamp="2025-10-13 14:24:38 +0000 UTC" firstStartedPulling="2025-10-13 14:24:40.97120963 +0000 UTC m=+4658.504759886" lastFinishedPulling="2025-10-13 14:24:41.16109238 +0000 UTC m=+4658.694642636" observedRunningTime="2025-10-13 14:24:41.88323638 +0000 UTC m=+4659.416786656" watchObservedRunningTime="2025-10-13 14:24:41.886432788 +0000 UTC m=+4659.419983044" Oct 13 14:24:44 crc kubenswrapper[4797]: I1013 14:24:44.594148 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:24:44 crc kubenswrapper[4797]: I1013 14:24:44.676486 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdd579685-r7l2d"] Oct 13 14:24:44 crc kubenswrapper[4797]: I1013 14:24:44.677005 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" 
containerName="dnsmasq-dns" containerID="cri-o://34ca4a52cafa7d4b3c88c1a3f437f8d3fd5c1ca6c86fe08989ed6f5698a93ae4" gracePeriod=10 Oct 13 14:24:44 crc kubenswrapper[4797]: I1013 14:24:44.899927 4797 generic.go:334] "Generic (PLEG): container finished" podID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerID="34ca4a52cafa7d4b3c88c1a3f437f8d3fd5c1ca6c86fe08989ed6f5698a93ae4" exitCode=0 Oct 13 14:24:44 crc kubenswrapper[4797]: I1013 14:24:44.899941 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" event={"ID":"ffc82494-7881-4544-9e63-cd6041bf8c2c","Type":"ContainerDied","Data":"34ca4a52cafa7d4b3c88c1a3f437f8d3fd5c1ca6c86fe08989ed6f5698a93ae4"} Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.110503 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.201178 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m8tl\" (UniqueName: \"kubernetes.io/projected/ffc82494-7881-4544-9e63-cd6041bf8c2c-kube-api-access-8m8tl\") pod \"ffc82494-7881-4544-9e63-cd6041bf8c2c\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.201242 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-config\") pod \"ffc82494-7881-4544-9e63-cd6041bf8c2c\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.201345 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-dns-svc\") pod \"ffc82494-7881-4544-9e63-cd6041bf8c2c\" (UID: \"ffc82494-7881-4544-9e63-cd6041bf8c2c\") " Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.210580 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc82494-7881-4544-9e63-cd6041bf8c2c-kube-api-access-8m8tl" (OuterVolumeSpecName: "kube-api-access-8m8tl") pod "ffc82494-7881-4544-9e63-cd6041bf8c2c" (UID: "ffc82494-7881-4544-9e63-cd6041bf8c2c"). InnerVolumeSpecName "kube-api-access-8m8tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.252896 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffc82494-7881-4544-9e63-cd6041bf8c2c" (UID: "ffc82494-7881-4544-9e63-cd6041bf8c2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.257379 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-config" (OuterVolumeSpecName: "config") pod "ffc82494-7881-4544-9e63-cd6041bf8c2c" (UID: "ffc82494-7881-4544-9e63-cd6041bf8c2c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.303910 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m8tl\" (UniqueName: \"kubernetes.io/projected/ffc82494-7881-4544-9e63-cd6041bf8c2c-kube-api-access-8m8tl\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.303947 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.303961 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffc82494-7881-4544-9e63-cd6041bf8c2c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.918718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" event={"ID":"ffc82494-7881-4544-9e63-cd6041bf8c2c","Type":"ContainerDied","Data":"0d0e1ea2ffcc01c0c37dc02cfa29558c4bd7bbaaee10d170a34204f49dab6abc"} Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.919122 4797 scope.go:117] "RemoveContainer" containerID="34ca4a52cafa7d4b3c88c1a3f437f8d3fd5c1ca6c86fe08989ed6f5698a93ae4" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.918831 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdd579685-r7l2d" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.948919 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdd579685-r7l2d"] Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.953267 4797 scope.go:117] "RemoveContainer" containerID="1490a034af90f099e5ff8b4d3fbecb32711ba3cb9ee1fb6a3081d2ed2299fa63" Oct 13 14:24:45 crc kubenswrapper[4797]: I1013 14:24:45.956181 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdd579685-r7l2d"] Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.247750 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" path="/var/lib/kubelet/pods/ffc82494-7881-4544-9e63-cd6041bf8c2c/volumes" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.334436 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 13 14:24:47 crc kubenswrapper[4797]: E1013 14:24:47.334757 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerName="init" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.334778 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerName="init" Oct 13 14:24:47 crc kubenswrapper[4797]: E1013 14:24:47.334830 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerName="dnsmasq-dns" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.334840 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerName="dnsmasq-dns" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.335014 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc82494-7881-4544-9e63-cd6041bf8c2c" containerName="dnsmasq-dns" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.335884 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.338476 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-tgghc" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.338896 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.341692 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.356918 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.451724 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-scripts\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.452088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.452114 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.452167 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-config\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.452243 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csm6z\" (UniqueName: \"kubernetes.io/projected/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-kube-api-access-csm6z\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.553308 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-config\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.553410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csm6z\" (UniqueName: \"kubernetes.io/projected/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-kube-api-access-csm6z\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.553467 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-scripts\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.553490 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.553517 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.554018 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.554526 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-scripts\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.554620 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-config\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.567056 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.585680 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csm6z\" (UniqueName: 
\"kubernetes.io/projected/3f6dd28e-459b-4ef1-8211-a82eba48d1bd-kube-api-access-csm6z\") pod \"ovn-northd-0\" (UID: \"3f6dd28e-459b-4ef1-8211-a82eba48d1bd\") " pod="openstack/ovn-northd-0" Oct 13 14:24:47 crc kubenswrapper[4797]: I1013 14:24:47.659142 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 14:24:48 crc kubenswrapper[4797]: I1013 14:24:48.088508 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 14:24:48 crc kubenswrapper[4797]: I1013 14:24:48.945759 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f6dd28e-459b-4ef1-8211-a82eba48d1bd","Type":"ContainerStarted","Data":"5b79233f1263ca146a5f1f0a72ee7d1b30efcc14ca57fdcc426c5575640e187a"} Oct 13 14:24:49 crc kubenswrapper[4797]: I1013 14:24:49.960873 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f6dd28e-459b-4ef1-8211-a82eba48d1bd","Type":"ContainerStarted","Data":"94b569c207d6a3d81f4dc52d79d0e1b1ea3b632665ab1f25f8a3f6cc6134e7e4"} Oct 13 14:24:49 crc kubenswrapper[4797]: I1013 14:24:49.961638 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f6dd28e-459b-4ef1-8211-a82eba48d1bd","Type":"ContainerStarted","Data":"56dfb37637f69e8783af5f30f5b835ce9f3fed921ababe5794f94afb56438438"} Oct 13 14:24:49 crc kubenswrapper[4797]: I1013 14:24:49.961668 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 13 14:24:50 crc kubenswrapper[4797]: I1013 14:24:50.000060 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.353767472 podStartE2EDuration="3.000028503s" podCreationTimestamp="2025-10-13 14:24:47 +0000 UTC" firstStartedPulling="2025-10-13 14:24:48.606334893 +0000 UTC m=+4666.139885169" lastFinishedPulling="2025-10-13 14:24:49.252595924 +0000 UTC 
m=+4666.786146200" observedRunningTime="2025-10-13 14:24:49.988308655 +0000 UTC m=+4667.521858981" watchObservedRunningTime="2025-10-13 14:24:50.000028503 +0000 UTC m=+4667.533578829" Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.119435 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-x8km4"] Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.120991 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.144123 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-x8km4"] Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.256457 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vrn\" (UniqueName: \"kubernetes.io/projected/bd035957-69f9-40dd-9c0d-ae7846605b91-kube-api-access-88vrn\") pod \"keystone-db-create-x8km4\" (UID: \"bd035957-69f9-40dd-9c0d-ae7846605b91\") " pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.358449 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88vrn\" (UniqueName: \"kubernetes.io/projected/bd035957-69f9-40dd-9c0d-ae7846605b91-kube-api-access-88vrn\") pod \"keystone-db-create-x8km4\" (UID: \"bd035957-69f9-40dd-9c0d-ae7846605b91\") " pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.381765 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vrn\" (UniqueName: \"kubernetes.io/projected/bd035957-69f9-40dd-9c0d-ae7846605b91-kube-api-access-88vrn\") pod \"keystone-db-create-x8km4\" (UID: \"bd035957-69f9-40dd-9c0d-ae7846605b91\") " pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.450019 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:53 crc kubenswrapper[4797]: I1013 14:24:53.966589 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-x8km4"] Oct 13 14:24:53 crc kubenswrapper[4797]: W1013 14:24:53.970417 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd035957_69f9_40dd_9c0d_ae7846605b91.slice/crio-7a3946b13da3b770bebe46d042c414402bfcf3ac2353ff87180806bd6f5aa4d8 WatchSource:0}: Error finding container 7a3946b13da3b770bebe46d042c414402bfcf3ac2353ff87180806bd6f5aa4d8: Status 404 returned error can't find the container with id 7a3946b13da3b770bebe46d042c414402bfcf3ac2353ff87180806bd6f5aa4d8 Oct 13 14:24:54 crc kubenswrapper[4797]: I1013 14:24:54.002668 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x8km4" event={"ID":"bd035957-69f9-40dd-9c0d-ae7846605b91","Type":"ContainerStarted","Data":"7a3946b13da3b770bebe46d042c414402bfcf3ac2353ff87180806bd6f5aa4d8"} Oct 13 14:24:55 crc kubenswrapper[4797]: I1013 14:24:55.013188 4797 generic.go:334] "Generic (PLEG): container finished" podID="bd035957-69f9-40dd-9c0d-ae7846605b91" containerID="89da6a592f418a5587dcdc6f2d22eea87653bc2729cf75cd3d262e4c57251374" exitCode=0 Oct 13 14:24:55 crc kubenswrapper[4797]: I1013 14:24:55.013246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x8km4" event={"ID":"bd035957-69f9-40dd-9c0d-ae7846605b91","Type":"ContainerDied","Data":"89da6a592f418a5587dcdc6f2d22eea87653bc2729cf75cd3d262e4c57251374"} Oct 13 14:24:56 crc kubenswrapper[4797]: I1013 14:24:56.405380 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:56 crc kubenswrapper[4797]: I1013 14:24:56.515282 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88vrn\" (UniqueName: \"kubernetes.io/projected/bd035957-69f9-40dd-9c0d-ae7846605b91-kube-api-access-88vrn\") pod \"bd035957-69f9-40dd-9c0d-ae7846605b91\" (UID: \"bd035957-69f9-40dd-9c0d-ae7846605b91\") " Oct 13 14:24:56 crc kubenswrapper[4797]: I1013 14:24:56.523647 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd035957-69f9-40dd-9c0d-ae7846605b91-kube-api-access-88vrn" (OuterVolumeSpecName: "kube-api-access-88vrn") pod "bd035957-69f9-40dd-9c0d-ae7846605b91" (UID: "bd035957-69f9-40dd-9c0d-ae7846605b91"). InnerVolumeSpecName "kube-api-access-88vrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:24:56 crc kubenswrapper[4797]: I1013 14:24:56.617507 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88vrn\" (UniqueName: \"kubernetes.io/projected/bd035957-69f9-40dd-9c0d-ae7846605b91-kube-api-access-88vrn\") on node \"crc\" DevicePath \"\"" Oct 13 14:24:57 crc kubenswrapper[4797]: I1013 14:24:57.034322 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x8km4" event={"ID":"bd035957-69f9-40dd-9c0d-ae7846605b91","Type":"ContainerDied","Data":"7a3946b13da3b770bebe46d042c414402bfcf3ac2353ff87180806bd6f5aa4d8"} Oct 13 14:24:57 crc kubenswrapper[4797]: I1013 14:24:57.034383 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-x8km4" Oct 13 14:24:57 crc kubenswrapper[4797]: I1013 14:24:57.034398 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a3946b13da3b770bebe46d042c414402bfcf3ac2353ff87180806bd6f5aa4d8" Oct 13 14:25:02 crc kubenswrapper[4797]: I1013 14:25:02.749034 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.205597 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3079-account-create-xw5xn"] Oct 13 14:25:03 crc kubenswrapper[4797]: E1013 14:25:03.206450 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd035957-69f9-40dd-9c0d-ae7846605b91" containerName="mariadb-database-create" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.206480 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd035957-69f9-40dd-9c0d-ae7846605b91" containerName="mariadb-database-create" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.206765 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd035957-69f9-40dd-9c0d-ae7846605b91" containerName="mariadb-database-create" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.207654 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.209310 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.216717 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3079-account-create-xw5xn"] Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.244686 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmsv\" (UniqueName: \"kubernetes.io/projected/6953ed8a-6074-4c10-9810-66b62741e903-kube-api-access-fdmsv\") pod \"keystone-3079-account-create-xw5xn\" (UID: \"6953ed8a-6074-4c10-9810-66b62741e903\") " pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.345876 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmsv\" (UniqueName: \"kubernetes.io/projected/6953ed8a-6074-4c10-9810-66b62741e903-kube-api-access-fdmsv\") pod \"keystone-3079-account-create-xw5xn\" (UID: \"6953ed8a-6074-4c10-9810-66b62741e903\") " pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.382366 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmsv\" (UniqueName: \"kubernetes.io/projected/6953ed8a-6074-4c10-9810-66b62741e903-kube-api-access-fdmsv\") pod \"keystone-3079-account-create-xw5xn\" (UID: \"6953ed8a-6074-4c10-9810-66b62741e903\") " pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:03 crc kubenswrapper[4797]: I1013 14:25:03.534392 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:04 crc kubenswrapper[4797]: I1013 14:25:04.036258 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3079-account-create-xw5xn"] Oct 13 14:25:04 crc kubenswrapper[4797]: W1013 14:25:04.048612 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6953ed8a_6074_4c10_9810_66b62741e903.slice/crio-e189247101f228402e6ddb3db22ded8bc6cfe856c11acf0fc167a6f77eff21de WatchSource:0}: Error finding container e189247101f228402e6ddb3db22ded8bc6cfe856c11acf0fc167a6f77eff21de: Status 404 returned error can't find the container with id e189247101f228402e6ddb3db22ded8bc6cfe856c11acf0fc167a6f77eff21de Oct 13 14:25:04 crc kubenswrapper[4797]: I1013 14:25:04.110501 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3079-account-create-xw5xn" event={"ID":"6953ed8a-6074-4c10-9810-66b62741e903","Type":"ContainerStarted","Data":"e189247101f228402e6ddb3db22ded8bc6cfe856c11acf0fc167a6f77eff21de"} Oct 13 14:25:05 crc kubenswrapper[4797]: I1013 14:25:05.122760 4797 generic.go:334] "Generic (PLEG): container finished" podID="6953ed8a-6074-4c10-9810-66b62741e903" containerID="9afb6cd0252adecf77addc7338033b8e4da0e1ec06022eb8854f48b3a5cbc2d0" exitCode=0 Oct 13 14:25:05 crc kubenswrapper[4797]: I1013 14:25:05.122922 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3079-account-create-xw5xn" event={"ID":"6953ed8a-6074-4c10-9810-66b62741e903","Type":"ContainerDied","Data":"9afb6cd0252adecf77addc7338033b8e4da0e1ec06022eb8854f48b3a5cbc2d0"} Oct 13 14:25:06 crc kubenswrapper[4797]: I1013 14:25:06.537571 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:06 crc kubenswrapper[4797]: I1013 14:25:06.604740 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmsv\" (UniqueName: \"kubernetes.io/projected/6953ed8a-6074-4c10-9810-66b62741e903-kube-api-access-fdmsv\") pod \"6953ed8a-6074-4c10-9810-66b62741e903\" (UID: \"6953ed8a-6074-4c10-9810-66b62741e903\") " Oct 13 14:25:06 crc kubenswrapper[4797]: I1013 14:25:06.615035 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6953ed8a-6074-4c10-9810-66b62741e903-kube-api-access-fdmsv" (OuterVolumeSpecName: "kube-api-access-fdmsv") pod "6953ed8a-6074-4c10-9810-66b62741e903" (UID: "6953ed8a-6074-4c10-9810-66b62741e903"). InnerVolumeSpecName "kube-api-access-fdmsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:25:06 crc kubenswrapper[4797]: I1013 14:25:06.706636 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdmsv\" (UniqueName: \"kubernetes.io/projected/6953ed8a-6074-4c10-9810-66b62741e903-kube-api-access-fdmsv\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:07 crc kubenswrapper[4797]: I1013 14:25:07.148698 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3079-account-create-xw5xn" event={"ID":"6953ed8a-6074-4c10-9810-66b62741e903","Type":"ContainerDied","Data":"e189247101f228402e6ddb3db22ded8bc6cfe856c11acf0fc167a6f77eff21de"} Oct 13 14:25:07 crc kubenswrapper[4797]: I1013 14:25:07.148862 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3079-account-create-xw5xn" Oct 13 14:25:07 crc kubenswrapper[4797]: I1013 14:25:07.149030 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e189247101f228402e6ddb3db22ded8bc6cfe856c11acf0fc167a6f77eff21de" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.689198 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lq62q"] Oct 13 14:25:08 crc kubenswrapper[4797]: E1013 14:25:08.689840 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6953ed8a-6074-4c10-9810-66b62741e903" containerName="mariadb-account-create" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.689853 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6953ed8a-6074-4c10-9810-66b62741e903" containerName="mariadb-account-create" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.690020 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6953ed8a-6074-4c10-9810-66b62741e903" containerName="mariadb-account-create" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.690661 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.692467 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.692684 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dqm8l" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.692974 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.695265 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.699863 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lq62q"] Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.740277 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-config-data\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.740373 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsc5z\" (UniqueName: \"kubernetes.io/projected/9a582257-89db-4b9f-926a-6631e27ee53e-kube-api-access-nsc5z\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.740513 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-combined-ca-bundle\") pod \"keystone-db-sync-lq62q\" (UID: 
\"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.842035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-config-data\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.842131 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsc5z\" (UniqueName: \"kubernetes.io/projected/9a582257-89db-4b9f-926a-6631e27ee53e-kube-api-access-nsc5z\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.842173 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-combined-ca-bundle\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.850756 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-combined-ca-bundle\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: I1013 14:25:08.852366 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-config-data\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:08 crc kubenswrapper[4797]: 
I1013 14:25:08.862954 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsc5z\" (UniqueName: \"kubernetes.io/projected/9a582257-89db-4b9f-926a-6631e27ee53e-kube-api-access-nsc5z\") pod \"keystone-db-sync-lq62q\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:09 crc kubenswrapper[4797]: I1013 14:25:09.021152 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:09 crc kubenswrapper[4797]: I1013 14:25:09.501298 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lq62q"] Oct 13 14:25:10 crc kubenswrapper[4797]: I1013 14:25:10.184287 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lq62q" event={"ID":"9a582257-89db-4b9f-926a-6631e27ee53e","Type":"ContainerStarted","Data":"64281b5fb01677071e277069ce428214b1a192aadfe82bb10138f6a61061bde0"} Oct 13 14:25:15 crc kubenswrapper[4797]: I1013 14:25:15.258521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lq62q" event={"ID":"9a582257-89db-4b9f-926a-6631e27ee53e","Type":"ContainerStarted","Data":"3c5d78b44e0d4a868b59d440a805df697edae9b0ba54a1c1a46167b209e09cd0"} Oct 13 14:25:15 crc kubenswrapper[4797]: I1013 14:25:15.282431 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lq62q" podStartSLOduration=2.761114101 podStartE2EDuration="7.282403991s" podCreationTimestamp="2025-10-13 14:25:08 +0000 UTC" firstStartedPulling="2025-10-13 14:25:09.503319301 +0000 UTC m=+4687.036869567" lastFinishedPulling="2025-10-13 14:25:14.024609201 +0000 UTC m=+4691.558159457" observedRunningTime="2025-10-13 14:25:15.269584047 +0000 UTC m=+4692.803134353" watchObservedRunningTime="2025-10-13 14:25:15.282403991 +0000 UTC m=+4692.815954287" Oct 13 14:25:16 crc kubenswrapper[4797]: I1013 14:25:16.259328 4797 generic.go:334] 
"Generic (PLEG): container finished" podID="9a582257-89db-4b9f-926a-6631e27ee53e" containerID="3c5d78b44e0d4a868b59d440a805df697edae9b0ba54a1c1a46167b209e09cd0" exitCode=0 Oct 13 14:25:16 crc kubenswrapper[4797]: I1013 14:25:16.259409 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lq62q" event={"ID":"9a582257-89db-4b9f-926a-6631e27ee53e","Type":"ContainerDied","Data":"3c5d78b44e0d4a868b59d440a805df697edae9b0ba54a1c1a46167b209e09cd0"} Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.688578 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.832863 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsc5z\" (UniqueName: \"kubernetes.io/projected/9a582257-89db-4b9f-926a-6631e27ee53e-kube-api-access-nsc5z\") pod \"9a582257-89db-4b9f-926a-6631e27ee53e\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.832962 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-combined-ca-bundle\") pod \"9a582257-89db-4b9f-926a-6631e27ee53e\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.833048 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-config-data\") pod \"9a582257-89db-4b9f-926a-6631e27ee53e\" (UID: \"9a582257-89db-4b9f-926a-6631e27ee53e\") " Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.844534 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a582257-89db-4b9f-926a-6631e27ee53e-kube-api-access-nsc5z" (OuterVolumeSpecName: 
"kube-api-access-nsc5z") pod "9a582257-89db-4b9f-926a-6631e27ee53e" (UID: "9a582257-89db-4b9f-926a-6631e27ee53e"). InnerVolumeSpecName "kube-api-access-nsc5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.875368 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a582257-89db-4b9f-926a-6631e27ee53e" (UID: "9a582257-89db-4b9f-926a-6631e27ee53e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.920511 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-config-data" (OuterVolumeSpecName: "config-data") pod "9a582257-89db-4b9f-926a-6631e27ee53e" (UID: "9a582257-89db-4b9f-926a-6631e27ee53e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.935628 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsc5z\" (UniqueName: \"kubernetes.io/projected/9a582257-89db-4b9f-926a-6631e27ee53e-kube-api-access-nsc5z\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.935667 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:17 crc kubenswrapper[4797]: I1013 14:25:17.935679 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a582257-89db-4b9f-926a-6631e27ee53e-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.282729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lq62q" event={"ID":"9a582257-89db-4b9f-926a-6631e27ee53e","Type":"ContainerDied","Data":"64281b5fb01677071e277069ce428214b1a192aadfe82bb10138f6a61061bde0"} Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.282774 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64281b5fb01677071e277069ce428214b1a192aadfe82bb10138f6a61061bde0" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.282886 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lq62q" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.605382 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f59f744ff-nbq59"] Oct 13 14:25:18 crc kubenswrapper[4797]: E1013 14:25:18.605735 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a582257-89db-4b9f-926a-6631e27ee53e" containerName="keystone-db-sync" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.605759 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a582257-89db-4b9f-926a-6631e27ee53e" containerName="keystone-db-sync" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.606003 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a582257-89db-4b9f-926a-6631e27ee53e" containerName="keystone-db-sync" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.607088 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.638774 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f59f744ff-nbq59"] Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.668965 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lswtg"] Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.670070 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.673944 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.675003 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dqm8l" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.675216 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.675404 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.718878 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lswtg"] Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpcrr\" (UniqueName: \"kubernetes.io/projected/7904403a-c230-4ffd-bf08-9a9a3dc153e3-kube-api-access-hpcrr\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754138 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754166 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-combined-ca-bundle\") pod 
\"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754208 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-config\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754235 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-config-data\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754272 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754286 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-credential-keys\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754311 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-fernet-keys\") pod 
\"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754331 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vll9f\" (UniqueName: \"kubernetes.io/projected/5000c5bd-1fda-44e9-81e0-5c55e3298868-kube-api-access-vll9f\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754354 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-dns-svc\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.754376 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-scripts\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856232 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-config-data\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856347 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: 
\"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856374 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-credential-keys\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856408 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-fernet-keys\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vll9f\" (UniqueName: \"kubernetes.io/projected/5000c5bd-1fda-44e9-81e0-5c55e3298868-kube-api-access-vll9f\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856478 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-dns-svc\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-scripts\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc 
kubenswrapper[4797]: I1013 14:25:18.856541 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcrr\" (UniqueName: \"kubernetes.io/projected/7904403a-c230-4ffd-bf08-9a9a3dc153e3-kube-api-access-hpcrr\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856566 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-combined-ca-bundle\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.856651 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-config\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.857513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-dns-svc\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.857674 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-config\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.858051 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.859452 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.861927 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-credential-keys\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.870079 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-scripts\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.870477 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-config-data\") 
pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.873383 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-combined-ca-bundle\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.878394 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-fernet-keys\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.881242 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vll9f\" (UniqueName: \"kubernetes.io/projected/5000c5bd-1fda-44e9-81e0-5c55e3298868-kube-api-access-vll9f\") pod \"keystone-bootstrap-lswtg\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.882676 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpcrr\" (UniqueName: \"kubernetes.io/projected/7904403a-c230-4ffd-bf08-9a9a3dc153e3-kube-api-access-hpcrr\") pod \"dnsmasq-dns-7f59f744ff-nbq59\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.958627 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:18 crc kubenswrapper[4797]: I1013 14:25:18.986228 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:19 crc kubenswrapper[4797]: I1013 14:25:19.399260 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f59f744ff-nbq59"] Oct 13 14:25:19 crc kubenswrapper[4797]: I1013 14:25:19.508054 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lswtg"] Oct 13 14:25:19 crc kubenswrapper[4797]: W1013 14:25:19.513565 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5000c5bd_1fda_44e9_81e0_5c55e3298868.slice/crio-6d13677d9a7e3be29480b9525209caa3363cdbc921a1b11eb7bd671294714b3d WatchSource:0}: Error finding container 6d13677d9a7e3be29480b9525209caa3363cdbc921a1b11eb7bd671294714b3d: Status 404 returned error can't find the container with id 6d13677d9a7e3be29480b9525209caa3363cdbc921a1b11eb7bd671294714b3d Oct 13 14:25:20 crc kubenswrapper[4797]: I1013 14:25:20.312562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lswtg" event={"ID":"5000c5bd-1fda-44e9-81e0-5c55e3298868","Type":"ContainerStarted","Data":"6735b15f8ed0d8c37dd10d1365169713bf17cdea1794049fbbd7855fff44fa08"} Oct 13 14:25:20 crc kubenswrapper[4797]: I1013 14:25:20.312925 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lswtg" event={"ID":"5000c5bd-1fda-44e9-81e0-5c55e3298868","Type":"ContainerStarted","Data":"6d13677d9a7e3be29480b9525209caa3363cdbc921a1b11eb7bd671294714b3d"} Oct 13 14:25:20 crc kubenswrapper[4797]: I1013 14:25:20.315111 4797 generic.go:334] "Generic (PLEG): container finished" podID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerID="9f34b95597c57f45dc00aaa0483de4cd17dcec706325c0157f77df5a0c88c7a9" exitCode=0 Oct 13 14:25:20 crc kubenswrapper[4797]: I1013 14:25:20.315142 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" 
event={"ID":"7904403a-c230-4ffd-bf08-9a9a3dc153e3","Type":"ContainerDied","Data":"9f34b95597c57f45dc00aaa0483de4cd17dcec706325c0157f77df5a0c88c7a9"} Oct 13 14:25:20 crc kubenswrapper[4797]: I1013 14:25:20.315156 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" event={"ID":"7904403a-c230-4ffd-bf08-9a9a3dc153e3","Type":"ContainerStarted","Data":"b2ef8c571766cafb80246fa01970f0b5c2f9b056d07bdf7fad796f01e98c10cf"} Oct 13 14:25:20 crc kubenswrapper[4797]: I1013 14:25:20.366497 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lswtg" podStartSLOduration=2.366478978 podStartE2EDuration="2.366478978s" podCreationTimestamp="2025-10-13 14:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:25:20.365150106 +0000 UTC m=+4697.898700382" watchObservedRunningTime="2025-10-13 14:25:20.366478978 +0000 UTC m=+4697.900029234" Oct 13 14:25:21 crc kubenswrapper[4797]: I1013 14:25:21.341713 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" event={"ID":"7904403a-c230-4ffd-bf08-9a9a3dc153e3","Type":"ContainerStarted","Data":"ac6ee89b9534d1e2a821c09543f500d5bbe06d905e25bb9dbd04d9c03db96ef0"} Oct 13 14:25:21 crc kubenswrapper[4797]: I1013 14:25:21.343061 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:21 crc kubenswrapper[4797]: I1013 14:25:21.392691 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" podStartSLOduration=3.392666095 podStartE2EDuration="3.392666095s" podCreationTimestamp="2025-10-13 14:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:25:21.383767836 +0000 UTC 
m=+4698.917318133" watchObservedRunningTime="2025-10-13 14:25:21.392666095 +0000 UTC m=+4698.926216361" Oct 13 14:25:23 crc kubenswrapper[4797]: I1013 14:25:23.364740 4797 generic.go:334] "Generic (PLEG): container finished" podID="5000c5bd-1fda-44e9-81e0-5c55e3298868" containerID="6735b15f8ed0d8c37dd10d1365169713bf17cdea1794049fbbd7855fff44fa08" exitCode=0 Oct 13 14:25:23 crc kubenswrapper[4797]: I1013 14:25:23.364852 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lswtg" event={"ID":"5000c5bd-1fda-44e9-81e0-5c55e3298868","Type":"ContainerDied","Data":"6735b15f8ed0d8c37dd10d1365169713bf17cdea1794049fbbd7855fff44fa08"} Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.719347 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.763422 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-combined-ca-bundle\") pod \"5000c5bd-1fda-44e9-81e0-5c55e3298868\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.763553 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-credential-keys\") pod \"5000c5bd-1fda-44e9-81e0-5c55e3298868\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.763760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vll9f\" (UniqueName: \"kubernetes.io/projected/5000c5bd-1fda-44e9-81e0-5c55e3298868-kube-api-access-vll9f\") pod \"5000c5bd-1fda-44e9-81e0-5c55e3298868\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.763917 
4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-scripts\") pod \"5000c5bd-1fda-44e9-81e0-5c55e3298868\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.763980 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-fernet-keys\") pod \"5000c5bd-1fda-44e9-81e0-5c55e3298868\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.764115 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-config-data\") pod \"5000c5bd-1fda-44e9-81e0-5c55e3298868\" (UID: \"5000c5bd-1fda-44e9-81e0-5c55e3298868\") " Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.770377 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5000c5bd-1fda-44e9-81e0-5c55e3298868-kube-api-access-vll9f" (OuterVolumeSpecName: "kube-api-access-vll9f") pod "5000c5bd-1fda-44e9-81e0-5c55e3298868" (UID: "5000c5bd-1fda-44e9-81e0-5c55e3298868"). InnerVolumeSpecName "kube-api-access-vll9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.770865 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5000c5bd-1fda-44e9-81e0-5c55e3298868" (UID: "5000c5bd-1fda-44e9-81e0-5c55e3298868"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.771889 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-scripts" (OuterVolumeSpecName: "scripts") pod "5000c5bd-1fda-44e9-81e0-5c55e3298868" (UID: "5000c5bd-1fda-44e9-81e0-5c55e3298868"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.771955 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5000c5bd-1fda-44e9-81e0-5c55e3298868" (UID: "5000c5bd-1fda-44e9-81e0-5c55e3298868"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.795984 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5000c5bd-1fda-44e9-81e0-5c55e3298868" (UID: "5000c5bd-1fda-44e9-81e0-5c55e3298868"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.811609 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-config-data" (OuterVolumeSpecName: "config-data") pod "5000c5bd-1fda-44e9-81e0-5c55e3298868" (UID: "5000c5bd-1fda-44e9-81e0-5c55e3298868"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.867226 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vll9f\" (UniqueName: \"kubernetes.io/projected/5000c5bd-1fda-44e9-81e0-5c55e3298868-kube-api-access-vll9f\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.867282 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.867302 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.867320 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.867340 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:24 crc kubenswrapper[4797]: I1013 14:25:24.867358 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5000c5bd-1fda-44e9-81e0-5c55e3298868-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.383261 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lswtg" event={"ID":"5000c5bd-1fda-44e9-81e0-5c55e3298868","Type":"ContainerDied","Data":"6d13677d9a7e3be29480b9525209caa3363cdbc921a1b11eb7bd671294714b3d"} Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 
14:25:25.383518 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d13677d9a7e3be29480b9525209caa3363cdbc921a1b11eb7bd671294714b3d" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.383299 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lswtg" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.513457 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lswtg"] Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.518768 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lswtg"] Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.619774 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8gv9r"] Oct 13 14:25:25 crc kubenswrapper[4797]: E1013 14:25:25.620406 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5000c5bd-1fda-44e9-81e0-5c55e3298868" containerName="keystone-bootstrap" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.620484 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5000c5bd-1fda-44e9-81e0-5c55e3298868" containerName="keystone-bootstrap" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.620705 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5000c5bd-1fda-44e9-81e0-5c55e3298868" containerName="keystone-bootstrap" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.621305 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.623753 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.623753 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.623935 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.624182 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dqm8l" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.630242 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8gv9r"] Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.679959 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-credential-keys\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.680042 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h55sl\" (UniqueName: \"kubernetes.io/projected/e38a41c7-6328-45df-b131-0f4e6a74563a-kube-api-access-h55sl\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.680099 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-combined-ca-bundle\") pod 
\"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.680137 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-scripts\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.680183 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-fernet-keys\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.681028 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-config-data\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.783171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-config-data\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.783270 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-credential-keys\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " 
pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.783310 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h55sl\" (UniqueName: \"kubernetes.io/projected/e38a41c7-6328-45df-b131-0f4e6a74563a-kube-api-access-h55sl\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.783350 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-combined-ca-bundle\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.783376 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-scripts\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.783410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-fernet-keys\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.787962 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-fernet-keys\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.788241 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-combined-ca-bundle\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.788602 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-config-data\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.789613 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-credential-keys\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.790474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-scripts\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.804370 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h55sl\" (UniqueName: \"kubernetes.io/projected/e38a41c7-6328-45df-b131-0f4e6a74563a-kube-api-access-h55sl\") pod \"keystone-bootstrap-8gv9r\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:25 crc kubenswrapper[4797]: I1013 14:25:25.978540 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:26 crc kubenswrapper[4797]: I1013 14:25:26.518070 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8gv9r"] Oct 13 14:25:27 crc kubenswrapper[4797]: I1013 14:25:27.250425 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5000c5bd-1fda-44e9-81e0-5c55e3298868" path="/var/lib/kubelet/pods/5000c5bd-1fda-44e9-81e0-5c55e3298868/volumes" Oct 13 14:25:27 crc kubenswrapper[4797]: I1013 14:25:27.419026 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8gv9r" event={"ID":"e38a41c7-6328-45df-b131-0f4e6a74563a","Type":"ContainerStarted","Data":"f80dbaf2dd4bcac7be154cafd83514fd4c13497e18e2dff9db8d1b2f8741e7d2"} Oct 13 14:25:27 crc kubenswrapper[4797]: I1013 14:25:27.419086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8gv9r" event={"ID":"e38a41c7-6328-45df-b131-0f4e6a74563a","Type":"ContainerStarted","Data":"3635c907b84aefd6eb80772829b70d6ec7f439493f46cf2c82e87a4c2165ac50"} Oct 13 14:25:27 crc kubenswrapper[4797]: I1013 14:25:27.457160 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8gv9r" podStartSLOduration=2.457132187 podStartE2EDuration="2.457132187s" podCreationTimestamp="2025-10-13 14:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:25:27.447612413 +0000 UTC m=+4704.981162729" watchObservedRunningTime="2025-10-13 14:25:27.457132187 +0000 UTC m=+4704.990682473" Oct 13 14:25:28 crc kubenswrapper[4797]: I1013 14:25:28.961109 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.030311 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-9d78d64fc-4gns6"] Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.030539 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerName="dnsmasq-dns" containerID="cri-o://2d42c462c6952a8d70787a870440947cfd2d204da07ec2ce15cfc0715532fc9f" gracePeriod=10 Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.444372 4797 generic.go:334] "Generic (PLEG): container finished" podID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerID="2d42c462c6952a8d70787a870440947cfd2d204da07ec2ce15cfc0715532fc9f" exitCode=0 Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.444449 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" event={"ID":"a7af7ea5-7441-487b-b6be-1ffad6754828","Type":"ContainerDied","Data":"2d42c462c6952a8d70787a870440947cfd2d204da07ec2ce15cfc0715532fc9f"} Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.446520 4797 generic.go:334] "Generic (PLEG): container finished" podID="e38a41c7-6328-45df-b131-0f4e6a74563a" containerID="f80dbaf2dd4bcac7be154cafd83514fd4c13497e18e2dff9db8d1b2f8741e7d2" exitCode=0 Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.446562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8gv9r" event={"ID":"e38a41c7-6328-45df-b131-0f4e6a74563a","Type":"ContainerDied","Data":"f80dbaf2dd4bcac7be154cafd83514fd4c13497e18e2dff9db8d1b2f8741e7d2"} Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.515030 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.551238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-config\") pod \"a7af7ea5-7441-487b-b6be-1ffad6754828\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.551411 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-dns-svc\") pod \"a7af7ea5-7441-487b-b6be-1ffad6754828\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.551457 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-nb\") pod \"a7af7ea5-7441-487b-b6be-1ffad6754828\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.551483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zflp8\" (UniqueName: \"kubernetes.io/projected/a7af7ea5-7441-487b-b6be-1ffad6754828-kube-api-access-zflp8\") pod \"a7af7ea5-7441-487b-b6be-1ffad6754828\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.551516 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-sb\") pod \"a7af7ea5-7441-487b-b6be-1ffad6754828\" (UID: \"a7af7ea5-7441-487b-b6be-1ffad6754828\") " Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.557004 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a7af7ea5-7441-487b-b6be-1ffad6754828-kube-api-access-zflp8" (OuterVolumeSpecName: "kube-api-access-zflp8") pod "a7af7ea5-7441-487b-b6be-1ffad6754828" (UID: "a7af7ea5-7441-487b-b6be-1ffad6754828"). InnerVolumeSpecName "kube-api-access-zflp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.591224 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7af7ea5-7441-487b-b6be-1ffad6754828" (UID: "a7af7ea5-7441-487b-b6be-1ffad6754828"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.597486 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-config" (OuterVolumeSpecName: "config") pod "a7af7ea5-7441-487b-b6be-1ffad6754828" (UID: "a7af7ea5-7441-487b-b6be-1ffad6754828"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.598027 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a7af7ea5-7441-487b-b6be-1ffad6754828" (UID: "a7af7ea5-7441-487b-b6be-1ffad6754828"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.599104 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a7af7ea5-7441-487b-b6be-1ffad6754828" (UID: "a7af7ea5-7441-487b-b6be-1ffad6754828"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.654515 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.654563 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.654581 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zflp8\" (UniqueName: \"kubernetes.io/projected/a7af7ea5-7441-487b-b6be-1ffad6754828-kube-api-access-zflp8\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.654593 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:29 crc kubenswrapper[4797]: I1013 14:25:29.654607 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7af7ea5-7441-487b-b6be-1ffad6754828-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.458294 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" event={"ID":"a7af7ea5-7441-487b-b6be-1ffad6754828","Type":"ContainerDied","Data":"3646c14d9463146325cb1a874467cc4b2d42ba9c843366f855ee88e42f247e21"} Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.458387 4797 scope.go:117] "RemoveContainer" containerID="2d42c462c6952a8d70787a870440947cfd2d204da07ec2ce15cfc0715532fc9f" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.458327 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d78d64fc-4gns6" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.505952 4797 scope.go:117] "RemoveContainer" containerID="6bb3c2d7af9c8506294e01054548a8e932fc6a3ecba112f6103c2d4880cff037" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.540069 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d78d64fc-4gns6"] Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.550858 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d78d64fc-4gns6"] Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.811208 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.875049 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h55sl\" (UniqueName: \"kubernetes.io/projected/e38a41c7-6328-45df-b131-0f4e6a74563a-kube-api-access-h55sl\") pod \"e38a41c7-6328-45df-b131-0f4e6a74563a\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.875547 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-credential-keys\") pod \"e38a41c7-6328-45df-b131-0f4e6a74563a\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.875621 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-fernet-keys\") pod \"e38a41c7-6328-45df-b131-0f4e6a74563a\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.875703 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-scripts\") pod \"e38a41c7-6328-45df-b131-0f4e6a74563a\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.875781 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-combined-ca-bundle\") pod \"e38a41c7-6328-45df-b131-0f4e6a74563a\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.875869 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-config-data\") pod \"e38a41c7-6328-45df-b131-0f4e6a74563a\" (UID: \"e38a41c7-6328-45df-b131-0f4e6a74563a\") " Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.881994 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e38a41c7-6328-45df-b131-0f4e6a74563a" (UID: "e38a41c7-6328-45df-b131-0f4e6a74563a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.883032 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-scripts" (OuterVolumeSpecName: "scripts") pod "e38a41c7-6328-45df-b131-0f4e6a74563a" (UID: "e38a41c7-6328-45df-b131-0f4e6a74563a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.886928 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e38a41c7-6328-45df-b131-0f4e6a74563a" (UID: "e38a41c7-6328-45df-b131-0f4e6a74563a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.900894 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38a41c7-6328-45df-b131-0f4e6a74563a-kube-api-access-h55sl" (OuterVolumeSpecName: "kube-api-access-h55sl") pod "e38a41c7-6328-45df-b131-0f4e6a74563a" (UID: "e38a41c7-6328-45df-b131-0f4e6a74563a"). InnerVolumeSpecName "kube-api-access-h55sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.902462 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-config-data" (OuterVolumeSpecName: "config-data") pod "e38a41c7-6328-45df-b131-0f4e6a74563a" (UID: "e38a41c7-6328-45df-b131-0f4e6a74563a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.903161 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e38a41c7-6328-45df-b131-0f4e6a74563a" (UID: "e38a41c7-6328-45df-b131-0f4e6a74563a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.978313 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.978344 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.978354 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.978363 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.978373 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38a41c7-6328-45df-b131-0f4e6a74563a-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:30 crc kubenswrapper[4797]: I1013 14:25:30.978382 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h55sl\" (UniqueName: \"kubernetes.io/projected/e38a41c7-6328-45df-b131-0f4e6a74563a-kube-api-access-h55sl\") on node \"crc\" DevicePath \"\"" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.246041 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" path="/var/lib/kubelet/pods/a7af7ea5-7441-487b-b6be-1ffad6754828/volumes" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.482611 4797 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8gv9r" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.483437 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8gv9r" event={"ID":"e38a41c7-6328-45df-b131-0f4e6a74563a","Type":"ContainerDied","Data":"3635c907b84aefd6eb80772829b70d6ec7f439493f46cf2c82e87a4c2165ac50"} Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.483469 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3635c907b84aefd6eb80772829b70d6ec7f439493f46cf2c82e87a4c2165ac50" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.560117 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-69d9d545f8-4q5z4"] Oct 13 14:25:31 crc kubenswrapper[4797]: E1013 14:25:31.560469 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerName="dnsmasq-dns" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.560488 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerName="dnsmasq-dns" Oct 13 14:25:31 crc kubenswrapper[4797]: E1013 14:25:31.560504 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38a41c7-6328-45df-b131-0f4e6a74563a" containerName="keystone-bootstrap" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.560512 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38a41c7-6328-45df-b131-0f4e6a74563a" containerName="keystone-bootstrap" Oct 13 14:25:31 crc kubenswrapper[4797]: E1013 14:25:31.560525 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerName="init" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.560531 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerName="init" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.560918 4797 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a7af7ea5-7441-487b-b6be-1ffad6754828" containerName="dnsmasq-dns" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.560936 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38a41c7-6328-45df-b131-0f4e6a74563a" containerName="keystone-bootstrap" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.561504 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.564395 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.564630 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.566314 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.578118 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dqm8l" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.579016 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69d9d545f8-4q5z4"] Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.587226 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drvxp\" (UniqueName: \"kubernetes.io/projected/e8508459-dcad-4083-884e-5f763b630be0-kube-api-access-drvxp\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.587275 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-fernet-keys\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.587468 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-scripts\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.587596 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-combined-ca-bundle\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.587703 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-credential-keys\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.587795 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-config-data\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.689020 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drvxp\" (UniqueName: 
\"kubernetes.io/projected/e8508459-dcad-4083-884e-5f763b630be0-kube-api-access-drvxp\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.689317 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-fernet-keys\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.689354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-scripts\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.689387 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-combined-ca-bundle\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.689416 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-credential-keys\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.689438 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-config-data\") pod 
\"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.693275 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-credential-keys\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.697289 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-scripts\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.697551 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-config-data\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.699482 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-fernet-keys\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.699678 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8508459-dcad-4083-884e-5f763b630be0-combined-ca-bundle\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc 
kubenswrapper[4797]: I1013 14:25:31.709385 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drvxp\" (UniqueName: \"kubernetes.io/projected/e8508459-dcad-4083-884e-5f763b630be0-kube-api-access-drvxp\") pod \"keystone-69d9d545f8-4q5z4\" (UID: \"e8508459-dcad-4083-884e-5f763b630be0\") " pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:31 crc kubenswrapper[4797]: I1013 14:25:31.879350 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:32 crc kubenswrapper[4797]: I1013 14:25:32.324556 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69d9d545f8-4q5z4"] Oct 13 14:25:32 crc kubenswrapper[4797]: I1013 14:25:32.496606 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69d9d545f8-4q5z4" event={"ID":"e8508459-dcad-4083-884e-5f763b630be0","Type":"ContainerStarted","Data":"28215f4ac5cdba92a15351fd2728615339a0e9e3c31346928355e949d7621daf"} Oct 13 14:25:33 crc kubenswrapper[4797]: I1013 14:25:33.507437 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69d9d545f8-4q5z4" event={"ID":"e8508459-dcad-4083-884e-5f763b630be0","Type":"ContainerStarted","Data":"a2002e124d79e98a4778a322876e95b3af48d8d9ab0ea5f4a90efc553396012e"} Oct 13 14:25:33 crc kubenswrapper[4797]: I1013 14:25:33.507995 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:25:33 crc kubenswrapper[4797]: I1013 14:25:33.540855 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-69d9d545f8-4q5z4" podStartSLOduration=2.540830888 podStartE2EDuration="2.540830888s" podCreationTimestamp="2025-10-13 14:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:25:33.529184853 +0000 UTC m=+4711.062735159" 
watchObservedRunningTime="2025-10-13 14:25:33.540830888 +0000 UTC m=+4711.074381164" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.702770 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-87b9n"] Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.706197 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.716638 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87b9n"] Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.808866 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-utilities\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.808923 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784tz\" (UniqueName: \"kubernetes.io/projected/38aab386-b290-4792-8d3f-f1dcd20f8968-kube-api-access-784tz\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.809236 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-catalog-content\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.910245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-utilities\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.910311 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784tz\" (UniqueName: \"kubernetes.io/projected/38aab386-b290-4792-8d3f-f1dcd20f8968-kube-api-access-784tz\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.910384 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-catalog-content\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.910724 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-utilities\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.910735 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-catalog-content\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:54 crc kubenswrapper[4797]: I1013 14:25:54.933931 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784tz\" (UniqueName: 
\"kubernetes.io/projected/38aab386-b290-4792-8d3f-f1dcd20f8968-kube-api-access-784tz\") pod \"redhat-operators-87b9n\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:55 crc kubenswrapper[4797]: I1013 14:25:55.042614 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:25:55 crc kubenswrapper[4797]: I1013 14:25:55.544850 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87b9n"] Oct 13 14:25:55 crc kubenswrapper[4797]: W1013 14:25:55.553998 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38aab386_b290_4792_8d3f_f1dcd20f8968.slice/crio-8eef1c1a7700cc837b96c455c5883f4bd3370ca1b39ad8c80eb06bb55cb9a68c WatchSource:0}: Error finding container 8eef1c1a7700cc837b96c455c5883f4bd3370ca1b39ad8c80eb06bb55cb9a68c: Status 404 returned error can't find the container with id 8eef1c1a7700cc837b96c455c5883f4bd3370ca1b39ad8c80eb06bb55cb9a68c Oct 13 14:25:55 crc kubenswrapper[4797]: I1013 14:25:55.750737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87b9n" event={"ID":"38aab386-b290-4792-8d3f-f1dcd20f8968","Type":"ContainerStarted","Data":"8eef1c1a7700cc837b96c455c5883f4bd3370ca1b39ad8c80eb06bb55cb9a68c"} Oct 13 14:25:56 crc kubenswrapper[4797]: I1013 14:25:56.765386 4797 generic.go:334] "Generic (PLEG): container finished" podID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerID="6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d" exitCode=0 Oct 13 14:25:56 crc kubenswrapper[4797]: I1013 14:25:56.765448 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87b9n" 
event={"ID":"38aab386-b290-4792-8d3f-f1dcd20f8968","Type":"ContainerDied","Data":"6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d"} Oct 13 14:25:56 crc kubenswrapper[4797]: I1013 14:25:56.774070 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:25:58 crc kubenswrapper[4797]: I1013 14:25:58.781705 4797 generic.go:334] "Generic (PLEG): container finished" podID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerID="3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442" exitCode=0 Oct 13 14:25:58 crc kubenswrapper[4797]: I1013 14:25:58.781749 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87b9n" event={"ID":"38aab386-b290-4792-8d3f-f1dcd20f8968","Type":"ContainerDied","Data":"3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442"} Oct 13 14:25:59 crc kubenswrapper[4797]: I1013 14:25:59.798886 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87b9n" event={"ID":"38aab386-b290-4792-8d3f-f1dcd20f8968","Type":"ContainerStarted","Data":"957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d"} Oct 13 14:25:59 crc kubenswrapper[4797]: I1013 14:25:59.830255 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-87b9n" podStartSLOduration=3.399087028 podStartE2EDuration="5.83021089s" podCreationTimestamp="2025-10-13 14:25:54 +0000 UTC" firstStartedPulling="2025-10-13 14:25:56.771547701 +0000 UTC m=+4734.305097997" lastFinishedPulling="2025-10-13 14:25:59.202671593 +0000 UTC m=+4736.736221859" observedRunningTime="2025-10-13 14:25:59.825834723 +0000 UTC m=+4737.359385019" watchObservedRunningTime="2025-10-13 14:25:59.83021089 +0000 UTC m=+4737.363761156" Oct 13 14:26:03 crc kubenswrapper[4797]: I1013 14:26:03.354030 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/keystone-69d9d545f8-4q5z4" Oct 13 14:26:05 crc kubenswrapper[4797]: I1013 14:26:05.043277 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:26:05 crc kubenswrapper[4797]: I1013 14:26:05.044649 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:26:05 crc kubenswrapper[4797]: I1013 14:26:05.125938 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:26:05 crc kubenswrapper[4797]: I1013 14:26:05.902003 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:26:05 crc kubenswrapper[4797]: I1013 14:26:05.959750 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-87b9n"] Oct 13 14:26:07 crc kubenswrapper[4797]: I1013 14:26:07.868545 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-87b9n" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="registry-server" containerID="cri-o://957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d" gracePeriod=2 Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.251004 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.253166 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.255479 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.255668 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7mc9f" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.259375 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.276356 4797 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a7bc036-bcae-49ba-a95f-a3fe41e95709\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T14:26:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T14:26:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T14:26:08Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T14:26:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-openstackclient:1e4eeec18f8da2b364b39b7a7358aef5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g8xtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T14:26:08Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.281132 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config-secret\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.281201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config\") pod \"openstackclient\" (UID: 
\"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.281303 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xtf\" (UniqueName: \"kubernetes.io/projected/3a7bc036-bcae-49ba-a95f-a3fe41e95709-kube-api-access-g8xtf\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.289208 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.305712 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.322344 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.337796 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.339283 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.342175 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.346938 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:08 crc kubenswrapper[4797]: E1013 14:26:08.379658 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-g8xtf openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.382112 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config-secret\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.382142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.382202 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xtf\" (UniqueName: \"kubernetes.io/projected/3a7bc036-bcae-49ba-a95f-a3fe41e95709-kube-api-access-g8xtf\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc 
kubenswrapper[4797]: I1013 14:26:08.385977 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.386025 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:26:08 crc kubenswrapper[4797]: E1013 14:26:08.387135 4797 projected.go:194] Error preparing data for projected volume kube-api-access-g8xtf for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (3a7bc036-bcae-49ba-a95f-a3fe41e95709) does not match the UID in record. The object might have been deleted and then recreated Oct 13 14:26:08 crc kubenswrapper[4797]: E1013 14:26:08.387200 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a7bc036-bcae-49ba-a95f-a3fe41e95709-kube-api-access-g8xtf podName:3a7bc036-bcae-49ba-a95f-a3fe41e95709 nodeName:}" failed. No retries permitted until 2025-10-13 14:26:08.887179336 +0000 UTC m=+4746.420729602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-g8xtf" (UniqueName: "kubernetes.io/projected/3a7bc036-bcae-49ba-a95f-a3fe41e95709-kube-api-access-g8xtf") pod "openstackclient" (UID: "3a7bc036-bcae-49ba-a95f-a3fe41e95709") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (3a7bc036-bcae-49ba-a95f-a3fe41e95709) does not match the UID in record. 
The object might have been deleted and then recreated Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.405636 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config-secret\") pod \"openstackclient\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.483756 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-catalog-content\") pod \"38aab386-b290-4792-8d3f-f1dcd20f8968\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.483879 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-utilities\") pod \"38aab386-b290-4792-8d3f-f1dcd20f8968\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.483931 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784tz\" (UniqueName: \"kubernetes.io/projected/38aab386-b290-4792-8d3f-f1dcd20f8968-kube-api-access-784tz\") pod \"38aab386-b290-4792-8d3f-f1dcd20f8968\" (UID: \"38aab386-b290-4792-8d3f-f1dcd20f8968\") " Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.484166 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.484219 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77gcs\" (UniqueName: \"kubernetes.io/projected/468f31d5-3d01-48fa-92ed-82c6cafc7690-kube-api-access-77gcs\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.484302 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config-secret\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.484863 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-utilities" (OuterVolumeSpecName: "utilities") pod "38aab386-b290-4792-8d3f-f1dcd20f8968" (UID: "38aab386-b290-4792-8d3f-f1dcd20f8968"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.487491 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38aab386-b290-4792-8d3f-f1dcd20f8968-kube-api-access-784tz" (OuterVolumeSpecName: "kube-api-access-784tz") pod "38aab386-b290-4792-8d3f-f1dcd20f8968" (UID: "38aab386-b290-4792-8d3f-f1dcd20f8968"). InnerVolumeSpecName "kube-api-access-784tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.579013 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38aab386-b290-4792-8d3f-f1dcd20f8968" (UID: "38aab386-b290-4792-8d3f-f1dcd20f8968"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.585272 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config-secret\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.585356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.585410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77gcs\" (UniqueName: \"kubernetes.io/projected/468f31d5-3d01-48fa-92ed-82c6cafc7690-kube-api-access-77gcs\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.585496 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.585509 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784tz\" (UniqueName: \"kubernetes.io/projected/38aab386-b290-4792-8d3f-f1dcd20f8968-kube-api-access-784tz\") on node \"crc\" DevicePath \"\"" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.585520 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38aab386-b290-4792-8d3f-f1dcd20f8968-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 
14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.586252 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.589406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config-secret\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.602025 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77gcs\" (UniqueName: \"kubernetes.io/projected/468f31d5-3d01-48fa-92ed-82c6cafc7690-kube-api-access-77gcs\") pod \"openstackclient\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.743535 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.894407 4797 generic.go:334] "Generic (PLEG): container finished" podID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerID="957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d" exitCode=0 Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.894476 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.895029 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87b9n" event={"ID":"38aab386-b290-4792-8d3f-f1dcd20f8968","Type":"ContainerDied","Data":"957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d"} Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.895061 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87b9n" event={"ID":"38aab386-b290-4792-8d3f-f1dcd20f8968","Type":"ContainerDied","Data":"8eef1c1a7700cc837b96c455c5883f4bd3370ca1b39ad8c80eb06bb55cb9a68c"} Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.895082 4797 scope.go:117] "RemoveContainer" containerID="957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.895090 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87b9n" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.900074 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.904027 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.907228 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.911349 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config-secret\") pod \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.911408 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config\") pod \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\" (UID: \"3a7bc036-bcae-49ba-a95f-a3fe41e95709\") " Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.911604 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8xtf\" (UniqueName: \"kubernetes.io/projected/3a7bc036-bcae-49ba-a95f-a3fe41e95709-kube-api-access-g8xtf\") on node \"crc\" DevicePath \"\"" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.913000 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3a7bc036-bcae-49ba-a95f-a3fe41e95709" (UID: "3a7bc036-bcae-49ba-a95f-a3fe41e95709"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.923931 4797 scope.go:117] "RemoveContainer" containerID="3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.924450 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3a7bc036-bcae-49ba-a95f-a3fe41e95709" (UID: "3a7bc036-bcae-49ba-a95f-a3fe41e95709"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.944785 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-87b9n"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.951186 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-87b9n"] Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.962510 4797 scope.go:117] "RemoveContainer" containerID="6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.981392 4797 scope.go:117] "RemoveContainer" containerID="957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d" Oct 13 14:26:08 crc kubenswrapper[4797]: E1013 14:26:08.981668 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d\": container with ID starting with 957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d not found: ID does not exist" containerID="957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.981699 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d"} err="failed to get container status \"957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d\": rpc error: code = NotFound desc = could not find container \"957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d\": container with ID starting with 957aff002b7692669abad30a1d53d30d83374fb95a7558e54b82caaac8a69f9d not found: ID does not exist" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.981718 4797 scope.go:117] "RemoveContainer" containerID="3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442" Oct 13 14:26:08 crc kubenswrapper[4797]: E1013 14:26:08.981906 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442\": container with ID starting with 3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442 not found: ID does not exist" containerID="3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.981926 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442"} err="failed to get container status \"3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442\": rpc error: code = NotFound desc = could not find container \"3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442\": container with ID starting with 3c9dd14699bfa59400e4297289357d7a30a7608cded8a8a1842be3b353c95442 not found: ID does not exist" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.981941 4797 scope.go:117] "RemoveContainer" containerID="6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d" Oct 13 14:26:08 crc kubenswrapper[4797]: E1013 14:26:08.982114 4797 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d\": container with ID starting with 6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d not found: ID does not exist" containerID="6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d" Oct 13 14:26:08 crc kubenswrapper[4797]: I1013 14:26:08.982133 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d"} err="failed to get container status \"6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d\": rpc error: code = NotFound desc = could not find container \"6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d\": container with ID starting with 6115fe0c58e4376e21f8ac6e8dd7e696ee1240c1078dcb57a8218ed8db334b9d not found: ID does not exist" Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.013149 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.013190 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a7bc036-bcae-49ba-a95f-a3fe41e95709-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:26:09 crc kubenswrapper[4797]: W1013 14:26:09.256263 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468f31d5_3d01_48fa_92ed_82c6cafc7690.slice/crio-a2f5882582d104d9acf7f0a1b6a2d3827d6b51475aed96e22240c345f562395b WatchSource:0}: Error finding container a2f5882582d104d9acf7f0a1b6a2d3827d6b51475aed96e22240c345f562395b: Status 404 returned error can't find the container with id 
a2f5882582d104d9acf7f0a1b6a2d3827d6b51475aed96e22240c345f562395b Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.264299 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" path="/var/lib/kubelet/pods/38aab386-b290-4792-8d3f-f1dcd20f8968/volumes" Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.265449 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" path="/var/lib/kubelet/pods/3a7bc036-bcae-49ba-a95f-a3fe41e95709/volumes" Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.265991 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.906065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"468f31d5-3d01-48fa-92ed-82c6cafc7690","Type":"ContainerStarted","Data":"a2f5882582d104d9acf7f0a1b6a2d3827d6b51475aed96e22240c345f562395b"} Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.907302 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.910510 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" Oct 13 14:26:09 crc kubenswrapper[4797]: I1013 14:26:09.912976 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3a7bc036-bcae-49ba-a95f-a3fe41e95709" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" Oct 13 14:26:18 crc kubenswrapper[4797]: I1013 14:26:18.120058 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:26:18 crc kubenswrapper[4797]: I1013 14:26:18.120873 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:26:20 crc kubenswrapper[4797]: I1013 14:26:20.002935 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"468f31d5-3d01-48fa-92ed-82c6cafc7690","Type":"ContainerStarted","Data":"039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c"} Oct 13 14:26:20 crc kubenswrapper[4797]: I1013 14:26:20.026190 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.7109162100000002 podStartE2EDuration="12.026163092s" podCreationTimestamp="2025-10-13 14:26:08 +0000 UTC" firstStartedPulling="2025-10-13 
14:26:09.272181077 +0000 UTC m=+4746.805731353" lastFinishedPulling="2025-10-13 14:26:19.587427939 +0000 UTC m=+4757.120978235" observedRunningTime="2025-10-13 14:26:20.015702806 +0000 UTC m=+4757.549253092" watchObservedRunningTime="2025-10-13 14:26:20.026163092 +0000 UTC m=+4757.559713388" Oct 13 14:26:48 crc kubenswrapper[4797]: I1013 14:26:48.120277 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:26:48 crc kubenswrapper[4797]: I1013 14:26:48.121075 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.196996 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5mvl"] Oct 13 14:27:00 crc kubenswrapper[4797]: E1013 14:27:00.197950 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="extract-content" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.197964 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="extract-content" Oct 13 14:27:00 crc kubenswrapper[4797]: E1013 14:27:00.197980 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="registry-server" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.197987 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" 
containerName="registry-server" Oct 13 14:27:00 crc kubenswrapper[4797]: E1013 14:27:00.198022 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="extract-utilities" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.198030 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="extract-utilities" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.198209 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="38aab386-b290-4792-8d3f-f1dcd20f8968" containerName="registry-server" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.199759 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.213115 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5mvl"] Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.282480 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gt9j\" (UniqueName: \"kubernetes.io/projected/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-kube-api-access-6gt9j\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.282564 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-utilities\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.282616 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-catalog-content\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.383739 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-utilities\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.383844 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-catalog-content\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.383937 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gt9j\" (UniqueName: \"kubernetes.io/projected/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-kube-api-access-6gt9j\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.384424 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-catalog-content\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.384705 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-utilities\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.404796 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gt9j\" (UniqueName: \"kubernetes.io/projected/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-kube-api-access-6gt9j\") pod \"redhat-marketplace-h5mvl\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:00 crc kubenswrapper[4797]: I1013 14:27:00.517819 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:01 crc kubenswrapper[4797]: I1013 14:27:01.013203 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5mvl"] Oct 13 14:27:01 crc kubenswrapper[4797]: W1013 14:27:01.018869 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87eb8b52_ac82_4cee_9c55_c3fb14aed2a4.slice/crio-e805ce7addb1e5ae4c09105b4c42fc809dc9421939e8bd6d2f5d2f1cb636f118 WatchSource:0}: Error finding container e805ce7addb1e5ae4c09105b4c42fc809dc9421939e8bd6d2f5d2f1cb636f118: Status 404 returned error can't find the container with id e805ce7addb1e5ae4c09105b4c42fc809dc9421939e8bd6d2f5d2f1cb636f118 Oct 13 14:27:01 crc kubenswrapper[4797]: I1013 14:27:01.404672 4797 generic.go:334] "Generic (PLEG): container finished" podID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerID="c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be" exitCode=0 Oct 13 14:27:01 crc kubenswrapper[4797]: I1013 14:27:01.405338 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5mvl" 
event={"ID":"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4","Type":"ContainerDied","Data":"c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be"} Oct 13 14:27:01 crc kubenswrapper[4797]: I1013 14:27:01.405768 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5mvl" event={"ID":"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4","Type":"ContainerStarted","Data":"e805ce7addb1e5ae4c09105b4c42fc809dc9421939e8bd6d2f5d2f1cb636f118"} Oct 13 14:27:02 crc kubenswrapper[4797]: I1013 14:27:02.427126 4797 generic.go:334] "Generic (PLEG): container finished" podID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerID="45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a" exitCode=0 Oct 13 14:27:02 crc kubenswrapper[4797]: I1013 14:27:02.428060 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5mvl" event={"ID":"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4","Type":"ContainerDied","Data":"45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a"} Oct 13 14:27:03 crc kubenswrapper[4797]: I1013 14:27:03.438057 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5mvl" event={"ID":"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4","Type":"ContainerStarted","Data":"0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c"} Oct 13 14:27:03 crc kubenswrapper[4797]: I1013 14:27:03.465786 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5mvl" podStartSLOduration=1.9594701209999998 podStartE2EDuration="3.465770296s" podCreationTimestamp="2025-10-13 14:27:00 +0000 UTC" firstStartedPulling="2025-10-13 14:27:01.408394026 +0000 UTC m=+4798.941944282" lastFinishedPulling="2025-10-13 14:27:02.914694201 +0000 UTC m=+4800.448244457" observedRunningTime="2025-10-13 14:27:03.457282368 +0000 UTC m=+4800.990832634" watchObservedRunningTime="2025-10-13 14:27:03.465770296 +0000 UTC 
m=+4800.999320552" Oct 13 14:27:10 crc kubenswrapper[4797]: I1013 14:27:10.518547 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:10 crc kubenswrapper[4797]: I1013 14:27:10.519168 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:10 crc kubenswrapper[4797]: I1013 14:27:10.563737 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:11 crc kubenswrapper[4797]: I1013 14:27:11.550526 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:11 crc kubenswrapper[4797]: I1013 14:27:11.597956 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5mvl"] Oct 13 14:27:13 crc kubenswrapper[4797]: I1013 14:27:13.531971 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5mvl" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="registry-server" containerID="cri-o://0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c" gracePeriod=2 Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.058088 4797 scope.go:117] "RemoveContainer" containerID="1ea42cc83adbb8ca30180c0024ecca3751f890ed77be1900674a3a4269e1af92" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.065060 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.099848 4797 scope.go:117] "RemoveContainer" containerID="7df8f2baf285eb6a7027e994031290a46a7ffd2c8c7998d845092f5f8121a367" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.127590 4797 scope.go:117] "RemoveContainer" containerID="9d949ba8220d4eaad18949311d083828126fd0e03af5ecc6897307ffffc1e19d" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.170427 4797 scope.go:117] "RemoveContainer" containerID="cf714e3407b0d29ee21f458d0091ddea60b12ebee9d446a2da9cc6392dd369c7" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.188441 4797 scope.go:117] "RemoveContainer" containerID="3e07c0bbbd68aae6e7c637d0ef7db977ddfe0745008fd17f9a72c382caf48bec" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.210867 4797 scope.go:117] "RemoveContainer" containerID="2984706f4a6aa410e8c3d22187daf4d9f45df63940c651963593443f3410f764" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.228365 4797 scope.go:117] "RemoveContainer" containerID="381fee8e8354196b3eeae958d1e5e043b47cee940c956023213668e8423081c5" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.251038 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gt9j\" (UniqueName: \"kubernetes.io/projected/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-kube-api-access-6gt9j\") pod \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.251079 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-catalog-content\") pod \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.251454 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-utilities\") pod \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\" (UID: \"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4\") " Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.252989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-utilities" (OuterVolumeSpecName: "utilities") pod "87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" (UID: "87eb8b52-ac82-4cee-9c55-c3fb14aed2a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.258659 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-kube-api-access-6gt9j" (OuterVolumeSpecName: "kube-api-access-6gt9j") pod "87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" (UID: "87eb8b52-ac82-4cee-9c55-c3fb14aed2a4"). InnerVolumeSpecName "kube-api-access-6gt9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.268259 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" (UID: "87eb8b52-ac82-4cee-9c55-c3fb14aed2a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.352966 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gt9j\" (UniqueName: \"kubernetes.io/projected/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-kube-api-access-6gt9j\") on node \"crc\" DevicePath \"\"" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.353266 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.353275 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.552761 4797 generic.go:334] "Generic (PLEG): container finished" podID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerID="0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c" exitCode=0 Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.552821 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5mvl" event={"ID":"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4","Type":"ContainerDied","Data":"0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c"} Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.552863 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5mvl" event={"ID":"87eb8b52-ac82-4cee-9c55-c3fb14aed2a4","Type":"ContainerDied","Data":"e805ce7addb1e5ae4c09105b4c42fc809dc9421939e8bd6d2f5d2f1cb636f118"} Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.552902 4797 scope.go:117] "RemoveContainer" containerID="0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 
14:27:14.552946 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5mvl" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.589421 4797 scope.go:117] "RemoveContainer" containerID="45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.606543 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5mvl"] Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.613194 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5mvl"] Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.624557 4797 scope.go:117] "RemoveContainer" containerID="c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.646494 4797 scope.go:117] "RemoveContainer" containerID="0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c" Oct 13 14:27:14 crc kubenswrapper[4797]: E1013 14:27:14.646935 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c\": container with ID starting with 0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c not found: ID does not exist" containerID="0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.646961 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c"} err="failed to get container status \"0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c\": rpc error: code = NotFound desc = could not find container \"0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c\": container with ID starting with 
0aed0ddd5a877a48febbf4334b0f6d9a82dbacb23f662e636a96055ced42439c not found: ID does not exist" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.646981 4797 scope.go:117] "RemoveContainer" containerID="45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a" Oct 13 14:27:14 crc kubenswrapper[4797]: E1013 14:27:14.647331 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a\": container with ID starting with 45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a not found: ID does not exist" containerID="45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.647350 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a"} err="failed to get container status \"45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a\": rpc error: code = NotFound desc = could not find container \"45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a\": container with ID starting with 45314c02072d4e3cd8f6f2b6ed9ef2e4966f38dff77cb73c71377f7829d7248a not found: ID does not exist" Oct 13 14:27:14 crc kubenswrapper[4797]: I1013 14:27:14.647361 4797 scope.go:117] "RemoveContainer" containerID="c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be" Oct 13 14:27:14 crc kubenswrapper[4797]: E1013 14:27:14.647553 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be\": container with ID starting with c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be not found: ID does not exist" containerID="c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be" Oct 13 14:27:14 crc 
kubenswrapper[4797]: I1013 14:27:14.647574 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be"} err="failed to get container status \"c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be\": rpc error: code = NotFound desc = could not find container \"c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be\": container with ID starting with c66f2c3df2b7ed180785692c97268f95fe3f5e7d2286cf77bf6853f2a05d73be not found: ID does not exist" Oct 13 14:27:15 crc kubenswrapper[4797]: I1013 14:27:15.254155 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" path="/var/lib/kubelet/pods/87eb8b52-ac82-4cee-9c55-c3fb14aed2a4/volumes" Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.120229 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.120674 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.120753 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.121829 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"bd2a230444894bdbf4c868b39927df09a997acfb15fbcd184a28d0e618ab816a"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.121930 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://bd2a230444894bdbf4c868b39927df09a997acfb15fbcd184a28d0e618ab816a" gracePeriod=600 Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.610399 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="bd2a230444894bdbf4c868b39927df09a997acfb15fbcd184a28d0e618ab816a" exitCode=0 Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.610470 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"bd2a230444894bdbf4c868b39927df09a997acfb15fbcd184a28d0e618ab816a"} Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.611088 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b"} Oct 13 14:27:18 crc kubenswrapper[4797]: I1013 14:27:18.611118 4797 scope.go:117] "RemoveContainer" containerID="e94c068ee3715b0c0c1473747789b8e54032f59cffffcb81aaefbd0a73c58ba2" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.940185 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-996sj"] Oct 13 14:27:46 crc kubenswrapper[4797]: E1013 14:27:46.940998 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="extract-utilities" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.941011 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="extract-utilities" Oct 13 14:27:46 crc kubenswrapper[4797]: E1013 14:27:46.941021 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="extract-content" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.941027 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="extract-content" Oct 13 14:27:46 crc kubenswrapper[4797]: E1013 14:27:46.941042 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="registry-server" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.941047 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="registry-server" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.941203 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="87eb8b52-ac82-4cee-9c55-c3fb14aed2a4" containerName="registry-server" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.941788 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-996sj" Oct 13 14:27:46 crc kubenswrapper[4797]: I1013 14:27:46.954124 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-996sj"] Oct 13 14:27:47 crc kubenswrapper[4797]: I1013 14:27:47.019952 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78zs8\" (UniqueName: \"kubernetes.io/projected/6fce202b-7f4c-4827-b0b0-056784afbb08-kube-api-access-78zs8\") pod \"barbican-db-create-996sj\" (UID: \"6fce202b-7f4c-4827-b0b0-056784afbb08\") " pod="openstack/barbican-db-create-996sj" Oct 13 14:27:47 crc kubenswrapper[4797]: I1013 14:27:47.121256 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78zs8\" (UniqueName: \"kubernetes.io/projected/6fce202b-7f4c-4827-b0b0-056784afbb08-kube-api-access-78zs8\") pod \"barbican-db-create-996sj\" (UID: \"6fce202b-7f4c-4827-b0b0-056784afbb08\") " pod="openstack/barbican-db-create-996sj" Oct 13 14:27:47 crc kubenswrapper[4797]: I1013 14:27:47.383621 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78zs8\" (UniqueName: \"kubernetes.io/projected/6fce202b-7f4c-4827-b0b0-056784afbb08-kube-api-access-78zs8\") pod \"barbican-db-create-996sj\" (UID: \"6fce202b-7f4c-4827-b0b0-056784afbb08\") " pod="openstack/barbican-db-create-996sj" Oct 13 14:27:47 crc kubenswrapper[4797]: I1013 14:27:47.557320 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-996sj" Oct 13 14:27:48 crc kubenswrapper[4797]: I1013 14:27:48.028405 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-996sj"] Oct 13 14:27:48 crc kubenswrapper[4797]: I1013 14:27:48.929986 4797 generic.go:334] "Generic (PLEG): container finished" podID="6fce202b-7f4c-4827-b0b0-056784afbb08" containerID="424eec589bc6b1db174bd15459b71a639e20ecabdb6a3078229196de7f1432ba" exitCode=0 Oct 13 14:27:48 crc kubenswrapper[4797]: I1013 14:27:48.930111 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-996sj" event={"ID":"6fce202b-7f4c-4827-b0b0-056784afbb08","Type":"ContainerDied","Data":"424eec589bc6b1db174bd15459b71a639e20ecabdb6a3078229196de7f1432ba"} Oct 13 14:27:48 crc kubenswrapper[4797]: I1013 14:27:48.930401 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-996sj" event={"ID":"6fce202b-7f4c-4827-b0b0-056784afbb08","Type":"ContainerStarted","Data":"ccfd9c5cecab4eacf37e04dd7a1dce416c909403be53955a2ac48e40038cde00"} Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.254856 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-996sj" Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.391654 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78zs8\" (UniqueName: \"kubernetes.io/projected/6fce202b-7f4c-4827-b0b0-056784afbb08-kube-api-access-78zs8\") pod \"6fce202b-7f4c-4827-b0b0-056784afbb08\" (UID: \"6fce202b-7f4c-4827-b0b0-056784afbb08\") " Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.397475 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fce202b-7f4c-4827-b0b0-056784afbb08-kube-api-access-78zs8" (OuterVolumeSpecName: "kube-api-access-78zs8") pod "6fce202b-7f4c-4827-b0b0-056784afbb08" (UID: "6fce202b-7f4c-4827-b0b0-056784afbb08"). InnerVolumeSpecName "kube-api-access-78zs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.494459 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78zs8\" (UniqueName: \"kubernetes.io/projected/6fce202b-7f4c-4827-b0b0-056784afbb08-kube-api-access-78zs8\") on node \"crc\" DevicePath \"\"" Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.948477 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-996sj" event={"ID":"6fce202b-7f4c-4827-b0b0-056784afbb08","Type":"ContainerDied","Data":"ccfd9c5cecab4eacf37e04dd7a1dce416c909403be53955a2ac48e40038cde00"} Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.948511 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-996sj" Oct 13 14:27:50 crc kubenswrapper[4797]: I1013 14:27:50.948522 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccfd9c5cecab4eacf37e04dd7a1dce416c909403be53955a2ac48e40038cde00" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.047452 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-247e-account-create-2hpqr"] Oct 13 14:27:57 crc kubenswrapper[4797]: E1013 14:27:57.048243 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce202b-7f4c-4827-b0b0-056784afbb08" containerName="mariadb-database-create" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.048256 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce202b-7f4c-4827-b0b0-056784afbb08" containerName="mariadb-database-create" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.048429 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce202b-7f4c-4827-b0b0-056784afbb08" containerName="mariadb-database-create" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.048937 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.056193 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.058045 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-247e-account-create-2hpqr"] Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.111326 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxdw\" (UniqueName: \"kubernetes.io/projected/90fae4c4-9315-431a-8b5a-ae7ca884d321-kube-api-access-lpxdw\") pod \"barbican-247e-account-create-2hpqr\" (UID: \"90fae4c4-9315-431a-8b5a-ae7ca884d321\") " pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.213244 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxdw\" (UniqueName: \"kubernetes.io/projected/90fae4c4-9315-431a-8b5a-ae7ca884d321-kube-api-access-lpxdw\") pod \"barbican-247e-account-create-2hpqr\" (UID: \"90fae4c4-9315-431a-8b5a-ae7ca884d321\") " pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.237081 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxdw\" (UniqueName: \"kubernetes.io/projected/90fae4c4-9315-431a-8b5a-ae7ca884d321-kube-api-access-lpxdw\") pod \"barbican-247e-account-create-2hpqr\" (UID: \"90fae4c4-9315-431a-8b5a-ae7ca884d321\") " pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.378864 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:27:57 crc kubenswrapper[4797]: I1013 14:27:57.840158 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-247e-account-create-2hpqr"] Oct 13 14:27:58 crc kubenswrapper[4797]: I1013 14:27:58.012609 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-247e-account-create-2hpqr" event={"ID":"90fae4c4-9315-431a-8b5a-ae7ca884d321","Type":"ContainerStarted","Data":"756427a03cf4d7437144651f7c7d03c21208f3496e265607cb767be930829401"} Oct 13 14:27:58 crc kubenswrapper[4797]: I1013 14:27:58.012888 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-247e-account-create-2hpqr" event={"ID":"90fae4c4-9315-431a-8b5a-ae7ca884d321","Type":"ContainerStarted","Data":"346abab76c8d22b06eecbd2338cbcfebd29e910a4b0086681080dd8d745ad635"} Oct 13 14:27:59 crc kubenswrapper[4797]: I1013 14:27:59.025691 4797 generic.go:334] "Generic (PLEG): container finished" podID="90fae4c4-9315-431a-8b5a-ae7ca884d321" containerID="756427a03cf4d7437144651f7c7d03c21208f3496e265607cb767be930829401" exitCode=0 Oct 13 14:27:59 crc kubenswrapper[4797]: I1013 14:27:59.026054 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-247e-account-create-2hpqr" event={"ID":"90fae4c4-9315-431a-8b5a-ae7ca884d321","Type":"ContainerDied","Data":"756427a03cf4d7437144651f7c7d03c21208f3496e265607cb767be930829401"} Oct 13 14:28:00 crc kubenswrapper[4797]: I1013 14:28:00.385096 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:28:00 crc kubenswrapper[4797]: I1013 14:28:00.470716 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpxdw\" (UniqueName: \"kubernetes.io/projected/90fae4c4-9315-431a-8b5a-ae7ca884d321-kube-api-access-lpxdw\") pod \"90fae4c4-9315-431a-8b5a-ae7ca884d321\" (UID: \"90fae4c4-9315-431a-8b5a-ae7ca884d321\") " Oct 13 14:28:00 crc kubenswrapper[4797]: I1013 14:28:00.479081 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fae4c4-9315-431a-8b5a-ae7ca884d321-kube-api-access-lpxdw" (OuterVolumeSpecName: "kube-api-access-lpxdw") pod "90fae4c4-9315-431a-8b5a-ae7ca884d321" (UID: "90fae4c4-9315-431a-8b5a-ae7ca884d321"). InnerVolumeSpecName "kube-api-access-lpxdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:28:00 crc kubenswrapper[4797]: I1013 14:28:00.572920 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpxdw\" (UniqueName: \"kubernetes.io/projected/90fae4c4-9315-431a-8b5a-ae7ca884d321-kube-api-access-lpxdw\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:01 crc kubenswrapper[4797]: I1013 14:28:01.049194 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-247e-account-create-2hpqr" event={"ID":"90fae4c4-9315-431a-8b5a-ae7ca884d321","Type":"ContainerDied","Data":"346abab76c8d22b06eecbd2338cbcfebd29e910a4b0086681080dd8d745ad635"} Oct 13 14:28:01 crc kubenswrapper[4797]: I1013 14:28:01.049233 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="346abab76c8d22b06eecbd2338cbcfebd29e910a4b0086681080dd8d745ad635" Oct 13 14:28:01 crc kubenswrapper[4797]: I1013 14:28:01.049676 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-247e-account-create-2hpqr" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.308423 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dfjfv"] Oct 13 14:28:02 crc kubenswrapper[4797]: E1013 14:28:02.310355 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fae4c4-9315-431a-8b5a-ae7ca884d321" containerName="mariadb-account-create" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.310398 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fae4c4-9315-431a-8b5a-ae7ca884d321" containerName="mariadb-account-create" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.310637 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fae4c4-9315-431a-8b5a-ae7ca884d321" containerName="mariadb-account-create" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.311365 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.313406 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-45tsx" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.314572 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.322911 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dfjfv"] Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.406188 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-combined-ca-bundle\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.406243 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qhnm\" (UniqueName: \"kubernetes.io/projected/aff293db-b437-4018-b341-f285a753a07e-kube-api-access-2qhnm\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.406310 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-db-sync-config-data\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.507229 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-db-sync-config-data\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.507542 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-combined-ca-bundle\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.507685 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qhnm\" (UniqueName: \"kubernetes.io/projected/aff293db-b437-4018-b341-f285a753a07e-kube-api-access-2qhnm\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.512225 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-combined-ca-bundle\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.513171 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-db-sync-config-data\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.538080 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qhnm\" (UniqueName: \"kubernetes.io/projected/aff293db-b437-4018-b341-f285a753a07e-kube-api-access-2qhnm\") pod \"barbican-db-sync-dfjfv\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.640273 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:02 crc kubenswrapper[4797]: I1013 14:28:02.881193 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dfjfv"] Oct 13 14:28:02 crc kubenswrapper[4797]: W1013 14:28:02.888311 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff293db_b437_4018_b341_f285a753a07e.slice/crio-835c73cf74c43cc7e641b3cba0ec99ef46987e8fbe24fbbcc05cfe55e23300c2 WatchSource:0}: Error finding container 835c73cf74c43cc7e641b3cba0ec99ef46987e8fbe24fbbcc05cfe55e23300c2: Status 404 returned error can't find the container with id 835c73cf74c43cc7e641b3cba0ec99ef46987e8fbe24fbbcc05cfe55e23300c2 Oct 13 14:28:03 crc kubenswrapper[4797]: I1013 14:28:03.070316 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dfjfv" event={"ID":"aff293db-b437-4018-b341-f285a753a07e","Type":"ContainerStarted","Data":"835c73cf74c43cc7e641b3cba0ec99ef46987e8fbe24fbbcc05cfe55e23300c2"} Oct 13 14:28:07 crc kubenswrapper[4797]: I1013 14:28:07.107553 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dfjfv" event={"ID":"aff293db-b437-4018-b341-f285a753a07e","Type":"ContainerStarted","Data":"e70d2fe9da59d5e516f20659af2b9b3a164386e9cf890c92f826cfcab4349a2b"} Oct 13 14:28:07 crc kubenswrapper[4797]: I1013 14:28:07.125089 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dfjfv" podStartSLOduration=1.232535825 podStartE2EDuration="5.125069823s" podCreationTimestamp="2025-10-13 14:28:02 +0000 UTC" firstStartedPulling="2025-10-13 14:28:02.891590826 +0000 UTC m=+4860.425141082" lastFinishedPulling="2025-10-13 14:28:06.784124784 +0000 UTC m=+4864.317675080" observedRunningTime="2025-10-13 14:28:07.122925981 +0000 UTC m=+4864.656476247" watchObservedRunningTime="2025-10-13 14:28:07.125069823 +0000 UTC 
m=+4864.658620079" Oct 13 14:28:08 crc kubenswrapper[4797]: I1013 14:28:08.119217 4797 generic.go:334] "Generic (PLEG): container finished" podID="aff293db-b437-4018-b341-f285a753a07e" containerID="e70d2fe9da59d5e516f20659af2b9b3a164386e9cf890c92f826cfcab4349a2b" exitCode=0 Oct 13 14:28:08 crc kubenswrapper[4797]: I1013 14:28:08.119281 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dfjfv" event={"ID":"aff293db-b437-4018-b341-f285a753a07e","Type":"ContainerDied","Data":"e70d2fe9da59d5e516f20659af2b9b3a164386e9cf890c92f826cfcab4349a2b"} Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.423913 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.551739 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qhnm\" (UniqueName: \"kubernetes.io/projected/aff293db-b437-4018-b341-f285a753a07e-kube-api-access-2qhnm\") pod \"aff293db-b437-4018-b341-f285a753a07e\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.551840 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-db-sync-config-data\") pod \"aff293db-b437-4018-b341-f285a753a07e\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.552078 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-combined-ca-bundle\") pod \"aff293db-b437-4018-b341-f285a753a07e\" (UID: \"aff293db-b437-4018-b341-f285a753a07e\") " Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.560896 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aff293db-b437-4018-b341-f285a753a07e-kube-api-access-2qhnm" (OuterVolumeSpecName: "kube-api-access-2qhnm") pod "aff293db-b437-4018-b341-f285a753a07e" (UID: "aff293db-b437-4018-b341-f285a753a07e"). InnerVolumeSpecName "kube-api-access-2qhnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.561026 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aff293db-b437-4018-b341-f285a753a07e" (UID: "aff293db-b437-4018-b341-f285a753a07e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.578321 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aff293db-b437-4018-b341-f285a753a07e" (UID: "aff293db-b437-4018-b341-f285a753a07e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.653945 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.653982 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qhnm\" (UniqueName: \"kubernetes.io/projected/aff293db-b437-4018-b341-f285a753a07e-kube-api-access-2qhnm\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:09 crc kubenswrapper[4797]: I1013 14:28:09.653996 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aff293db-b437-4018-b341-f285a753a07e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.143295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dfjfv" event={"ID":"aff293db-b437-4018-b341-f285a753a07e","Type":"ContainerDied","Data":"835c73cf74c43cc7e641b3cba0ec99ef46987e8fbe24fbbcc05cfe55e23300c2"} Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.143351 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="835c73cf74c43cc7e641b3cba0ec99ef46987e8fbe24fbbcc05cfe55e23300c2" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.143425 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dfjfv" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.440196 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d8dfccd44-t4jbz"] Oct 13 14:28:10 crc kubenswrapper[4797]: E1013 14:28:10.440616 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff293db-b437-4018-b341-f285a753a07e" containerName="barbican-db-sync" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.440633 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff293db-b437-4018-b341-f285a753a07e" containerName="barbican-db-sync" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.440888 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff293db-b437-4018-b341-f285a753a07e" containerName="barbican-db-sync" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.442019 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.450152 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.455554 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.455734 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-45tsx" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.460541 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d95764d5c-br2pn"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.462285 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.466092 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.467513 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d8dfccd44-t4jbz"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.478192 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d95764d5c-br2pn"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.533264 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bbdf8b99-5z4rr"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.535890 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.549194 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bbdf8b99-5z4rr"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573159 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtznt\" (UniqueName: \"kubernetes.io/projected/15f72fda-eacd-42a1-8be8-92e28ed31924-kube-api-access-jtznt\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573207 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-config-data\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc 
kubenswrapper[4797]: I1013 14:28:10.573267 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-config-data-custom\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573288 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-config-data\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573315 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-logs\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f72fda-eacd-42a1-8be8-92e28ed31924-logs\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573367 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-config-data-custom\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: 
\"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573381 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrlm\" (UniqueName: \"kubernetes.io/projected/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-kube-api-access-vnrlm\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573402 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.573440 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-combined-ca-bundle\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.601387 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b7d9bc6cb-z65xd"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.604612 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.606826 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.616017 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b7d9bc6cb-z65xd"] Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.675966 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-dns-svc\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676326 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtznt\" (UniqueName: \"kubernetes.io/projected/15f72fda-eacd-42a1-8be8-92e28ed31924-kube-api-access-jtznt\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676351 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-config-data\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676398 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-config-data\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " 
pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676422 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-combined-ca-bundle\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676447 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676465 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-config-data-custom\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676565 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-config-data\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676631 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqmw\" (UniqueName: \"kubernetes.io/projected/e2fb17bd-c741-47d7-8f3b-c9057c58105c-kube-api-access-ttqmw\") pod 
\"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676657 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpdm\" (UniqueName: \"kubernetes.io/projected/80b0a71d-5c0a-4432-a2cd-15010490e327-kube-api-access-kdpdm\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676676 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-logs\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676723 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-config-data-custom\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676743 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f72fda-eacd-42a1-8be8-92e28ed31924-logs\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676789 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-config\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676821 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676849 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-config-data-custom\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676866 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrlm\" (UniqueName: \"kubernetes.io/projected/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-kube-api-access-vnrlm\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676886 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676946 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fb17bd-c741-47d7-8f3b-c9057c58105c-logs\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.676970 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-combined-ca-bundle\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.677415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-logs\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.677664 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f72fda-eacd-42a1-8be8-92e28ed31924-logs\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.694215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-config-data-custom\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.695652 4797 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-combined-ca-bundle\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.696223 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-config-data-custom\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.696514 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f72fda-eacd-42a1-8be8-92e28ed31924-config-data\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.697871 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtznt\" (UniqueName: \"kubernetes.io/projected/15f72fda-eacd-42a1-8be8-92e28ed31924-kube-api-access-jtznt\") pod \"barbican-worker-5d95764d5c-br2pn\" (UID: \"15f72fda-eacd-42a1-8be8-92e28ed31924\") " pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.699781 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrlm\" (UniqueName: \"kubernetes.io/projected/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-kube-api-access-vnrlm\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.700228 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.703108 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7afb7d3-8d9d-475f-9ef5-75c5125ec374-config-data\") pod \"barbican-keystone-listener-5d8dfccd44-t4jbz\" (UID: \"d7afb7d3-8d9d-475f-9ef5-75c5125ec374\") " pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.767786 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.782142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttqmw\" (UniqueName: \"kubernetes.io/projected/e2fb17bd-c741-47d7-8f3b-c9057c58105c-kube-api-access-ttqmw\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.782183 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpdm\" (UniqueName: \"kubernetes.io/projected/80b0a71d-5c0a-4432-a2cd-15010490e327-kube-api-access-kdpdm\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.782212 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-config-data-custom\") pod \"barbican-api-5b7d9bc6cb-z65xd\" 
(UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.782245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-config\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.782261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.783236 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-config\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.783305 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fb17bd-c741-47d7-8f3b-c9057c58105c-logs\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.783367 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-dns-svc\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc 
kubenswrapper[4797]: I1013 14:28:10.783420 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-config-data\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.783452 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-combined-ca-bundle\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.783483 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.784204 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.784511 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fb17bd-c741-47d7-8f3b-c9057c58105c-logs\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.785555 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-dns-svc\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.787599 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-combined-ca-bundle\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.788334 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-config-data-custom\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.788665 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.791791 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fb17bd-c741-47d7-8f3b-c9057c58105c-config-data\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.793234 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d95764d5c-br2pn" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.801541 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpdm\" (UniqueName: \"kubernetes.io/projected/80b0a71d-5c0a-4432-a2cd-15010490e327-kube-api-access-kdpdm\") pod \"dnsmasq-dns-58bbdf8b99-5z4rr\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.802498 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttqmw\" (UniqueName: \"kubernetes.io/projected/e2fb17bd-c741-47d7-8f3b-c9057c58105c-kube-api-access-ttqmw\") pod \"barbican-api-5b7d9bc6cb-z65xd\" (UID: \"e2fb17bd-c741-47d7-8f3b-c9057c58105c\") " pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.864249 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:10 crc kubenswrapper[4797]: I1013 14:28:10.931088 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:11 crc kubenswrapper[4797]: I1013 14:28:11.208048 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d8dfccd44-t4jbz"] Oct 13 14:28:11 crc kubenswrapper[4797]: W1013 14:28:11.220204 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7afb7d3_8d9d_475f_9ef5_75c5125ec374.slice/crio-3e6523c83c426e756655cef83e3984ea18f4867a78132d5c2f16d589a268006b WatchSource:0}: Error finding container 3e6523c83c426e756655cef83e3984ea18f4867a78132d5c2f16d589a268006b: Status 404 returned error can't find the container with id 3e6523c83c426e756655cef83e3984ea18f4867a78132d5c2f16d589a268006b Oct 13 14:28:11 crc kubenswrapper[4797]: W1013 14:28:11.318755 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15f72fda_eacd_42a1_8be8_92e28ed31924.slice/crio-19602660748f30a66a3972290ee9baec4a40910b545dda7c762a0471724451f3 WatchSource:0}: Error finding container 19602660748f30a66a3972290ee9baec4a40910b545dda7c762a0471724451f3: Status 404 returned error can't find the container with id 19602660748f30a66a3972290ee9baec4a40910b545dda7c762a0471724451f3 Oct 13 14:28:11 crc kubenswrapper[4797]: I1013 14:28:11.320203 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d95764d5c-br2pn"] Oct 13 14:28:11 crc kubenswrapper[4797]: I1013 14:28:11.463506 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bbdf8b99-5z4rr"] Oct 13 14:28:11 crc kubenswrapper[4797]: W1013 14:28:11.479490 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80b0a71d_5c0a_4432_a2cd_15010490e327.slice/crio-16570f060bc1832516b7f9a0d92e68367712d29f7a16cb47766897a0033fec60 WatchSource:0}: Error 
finding container 16570f060bc1832516b7f9a0d92e68367712d29f7a16cb47766897a0033fec60: Status 404 returned error can't find the container with id 16570f060bc1832516b7f9a0d92e68367712d29f7a16cb47766897a0033fec60 Oct 13 14:28:11 crc kubenswrapper[4797]: I1013 14:28:11.514302 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b7d9bc6cb-z65xd"] Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.176899 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" event={"ID":"d7afb7d3-8d9d-475f-9ef5-75c5125ec374","Type":"ContainerStarted","Data":"3e6523c83c426e756655cef83e3984ea18f4867a78132d5c2f16d589a268006b"} Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.180214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d95764d5c-br2pn" event={"ID":"15f72fda-eacd-42a1-8be8-92e28ed31924","Type":"ContainerStarted","Data":"19602660748f30a66a3972290ee9baec4a40910b545dda7c762a0471724451f3"} Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.185414 4797 generic.go:334] "Generic (PLEG): container finished" podID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerID="cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa" exitCode=0 Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.185486 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" event={"ID":"80b0a71d-5c0a-4432-a2cd-15010490e327","Type":"ContainerDied","Data":"cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa"} Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.185516 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" event={"ID":"80b0a71d-5c0a-4432-a2cd-15010490e327","Type":"ContainerStarted","Data":"16570f060bc1832516b7f9a0d92e68367712d29f7a16cb47766897a0033fec60"} Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.188927 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" event={"ID":"e2fb17bd-c741-47d7-8f3b-c9057c58105c","Type":"ContainerStarted","Data":"04c659b4e06d855f3b8285f655feb9c4cd5f339804c95f9a033fe324ca333e41"} Oct 13 14:28:12 crc kubenswrapper[4797]: I1013 14:28:12.188976 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" event={"ID":"e2fb17bd-c741-47d7-8f3b-c9057c58105c","Type":"ContainerStarted","Data":"f9f54ccc2c2a87e3ad38154134029fee4a0dc7849830cd75360da548a6f747ac"} Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.201155 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d95764d5c-br2pn" event={"ID":"15f72fda-eacd-42a1-8be8-92e28ed31924","Type":"ContainerStarted","Data":"4ca048b17b089262aaab36a8d40f40b7e7085abb4ab08ec4149bf058623155b0"} Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.201562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d95764d5c-br2pn" event={"ID":"15f72fda-eacd-42a1-8be8-92e28ed31924","Type":"ContainerStarted","Data":"dc7b4bc0df090af3ee0d7524b1631e0edbef0aabb05723e5b676763ef8c4a8df"} Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.204173 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" event={"ID":"80b0a71d-5c0a-4432-a2cd-15010490e327","Type":"ContainerStarted","Data":"cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb"} Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.204319 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.206655 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" event={"ID":"e2fb17bd-c741-47d7-8f3b-c9057c58105c","Type":"ContainerStarted","Data":"279b88581c8b115a717e1dc1139b049ab24ceda41d8dfcc4787c39b7d3efc806"} Oct 13 14:28:13 
crc kubenswrapper[4797]: I1013 14:28:13.207197 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.207231 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.208593 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" event={"ID":"d7afb7d3-8d9d-475f-9ef5-75c5125ec374","Type":"ContainerStarted","Data":"29a6c3884b80f111252667457406578bb4424576388f7f2cc1eb5fe84abd5052"} Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.208620 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" event={"ID":"d7afb7d3-8d9d-475f-9ef5-75c5125ec374","Type":"ContainerStarted","Data":"13782f743d9608606c2c1b1a0633d0e23f6f5d2081ae3ea62b9e90e93ec3e85f"} Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.225248 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d95764d5c-br2pn" podStartSLOduration=2.148907493 podStartE2EDuration="3.225231009s" podCreationTimestamp="2025-10-13 14:28:10 +0000 UTC" firstStartedPulling="2025-10-13 14:28:11.320928668 +0000 UTC m=+4868.854478924" lastFinishedPulling="2025-10-13 14:28:12.397252184 +0000 UTC m=+4869.930802440" observedRunningTime="2025-10-13 14:28:13.223266921 +0000 UTC m=+4870.756817187" watchObservedRunningTime="2025-10-13 14:28:13.225231009 +0000 UTC m=+4870.758781295" Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.253347 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" podStartSLOduration=3.253317887 podStartE2EDuration="3.253317887s" podCreationTimestamp="2025-10-13 14:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:28:13.250721683 +0000 UTC m=+4870.784271959" watchObservedRunningTime="2025-10-13 14:28:13.253317887 +0000 UTC m=+4870.786868173" Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.280188 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" podStartSLOduration=3.280169904 podStartE2EDuration="3.280169904s" podCreationTimestamp="2025-10-13 14:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:28:13.275877629 +0000 UTC m=+4870.809427945" watchObservedRunningTime="2025-10-13 14:28:13.280169904 +0000 UTC m=+4870.813720170" Oct 13 14:28:13 crc kubenswrapper[4797]: I1013 14:28:13.302891 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d8dfccd44-t4jbz" podStartSLOduration=2.129817716 podStartE2EDuration="3.30286561s" podCreationTimestamp="2025-10-13 14:28:10 +0000 UTC" firstStartedPulling="2025-10-13 14:28:11.222458337 +0000 UTC m=+4868.756008603" lastFinishedPulling="2025-10-13 14:28:12.395506241 +0000 UTC m=+4869.929056497" observedRunningTime="2025-10-13 14:28:13.295261274 +0000 UTC m=+4870.828811580" watchObservedRunningTime="2025-10-13 14:28:13.30286561 +0000 UTC m=+4870.836415876" Oct 13 14:28:20 crc kubenswrapper[4797]: I1013 14:28:20.867083 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:28:20 crc kubenswrapper[4797]: I1013 14:28:20.928052 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f59f744ff-nbq59"] Oct 13 14:28:20 crc kubenswrapper[4797]: I1013 14:28:20.932015 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" 
podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerName="dnsmasq-dns" containerID="cri-o://ac6ee89b9534d1e2a821c09543f500d5bbe06d905e25bb9dbd04d9c03db96ef0" gracePeriod=10 Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.338476 4797 generic.go:334] "Generic (PLEG): container finished" podID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerID="ac6ee89b9534d1e2a821c09543f500d5bbe06d905e25bb9dbd04d9c03db96ef0" exitCode=0 Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.338522 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" event={"ID":"7904403a-c230-4ffd-bf08-9a9a3dc153e3","Type":"ContainerDied","Data":"ac6ee89b9534d1e2a821c09543f500d5bbe06d905e25bb9dbd04d9c03db96ef0"} Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.425667 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.578449 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpcrr\" (UniqueName: \"kubernetes.io/projected/7904403a-c230-4ffd-bf08-9a9a3dc153e3-kube-api-access-hpcrr\") pod \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.578524 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-nb\") pod \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.578548 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-dns-svc\") pod \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " Oct 
13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.578582 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-sb\") pod \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.578649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-config\") pod \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\" (UID: \"7904403a-c230-4ffd-bf08-9a9a3dc153e3\") " Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.584083 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7904403a-c230-4ffd-bf08-9a9a3dc153e3-kube-api-access-hpcrr" (OuterVolumeSpecName: "kube-api-access-hpcrr") pod "7904403a-c230-4ffd-bf08-9a9a3dc153e3" (UID: "7904403a-c230-4ffd-bf08-9a9a3dc153e3"). InnerVolumeSpecName "kube-api-access-hpcrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.627689 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7904403a-c230-4ffd-bf08-9a9a3dc153e3" (UID: "7904403a-c230-4ffd-bf08-9a9a3dc153e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.635876 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7904403a-c230-4ffd-bf08-9a9a3dc153e3" (UID: "7904403a-c230-4ffd-bf08-9a9a3dc153e3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.645904 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-config" (OuterVolumeSpecName: "config") pod "7904403a-c230-4ffd-bf08-9a9a3dc153e3" (UID: "7904403a-c230-4ffd-bf08-9a9a3dc153e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.655377 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7904403a-c230-4ffd-bf08-9a9a3dc153e3" (UID: "7904403a-c230-4ffd-bf08-9a9a3dc153e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.680771 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.680799 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.680852 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpcrr\" (UniqueName: \"kubernetes.io/projected/7904403a-c230-4ffd-bf08-9a9a3dc153e3-kube-api-access-hpcrr\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:21 crc kubenswrapper[4797]: I1013 14:28:21.680862 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:21 crc 
kubenswrapper[4797]: I1013 14:28:21.680870 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7904403a-c230-4ffd-bf08-9a9a3dc153e3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.299505 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.353205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" event={"ID":"7904403a-c230-4ffd-bf08-9a9a3dc153e3","Type":"ContainerDied","Data":"b2ef8c571766cafb80246fa01970f0b5c2f9b056d07bdf7fad796f01e98c10cf"} Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.353295 4797 scope.go:117] "RemoveContainer" containerID="ac6ee89b9534d1e2a821c09543f500d5bbe06d905e25bb9dbd04d9c03db96ef0" Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.353329 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f59f744ff-nbq59" Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.379705 4797 scope.go:117] "RemoveContainer" containerID="9f34b95597c57f45dc00aaa0483de4cd17dcec706325c0157f77df5a0c88c7a9" Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.386802 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b7d9bc6cb-z65xd" Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.410110 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f59f744ff-nbq59"] Oct 13 14:28:22 crc kubenswrapper[4797]: I1013 14:28:22.418695 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f59f744ff-nbq59"] Oct 13 14:28:23 crc kubenswrapper[4797]: I1013 14:28:23.252302 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" path="/var/lib/kubelet/pods/7904403a-c230-4ffd-bf08-9a9a3dc153e3/volumes" Oct 13 14:28:34 crc kubenswrapper[4797]: E1013 14:28:34.977696 4797 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.147:39088->38.102.83.147:46853: write tcp 38.102.83.147:39088->38.102.83.147:46853: write: broken pipe Oct 13 14:28:35 crc kubenswrapper[4797]: I1013 14:28:35.989295 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lbdbr"] Oct 13 14:28:35 crc kubenswrapper[4797]: E1013 14:28:35.989691 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerName="init" Oct 13 14:28:35 crc kubenswrapper[4797]: I1013 14:28:35.989706 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerName="init" Oct 13 14:28:35 crc kubenswrapper[4797]: E1013 14:28:35.989749 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" 
containerName="dnsmasq-dns" Oct 13 14:28:35 crc kubenswrapper[4797]: I1013 14:28:35.989758 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerName="dnsmasq-dns" Oct 13 14:28:35 crc kubenswrapper[4797]: I1013 14:28:35.989989 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7904403a-c230-4ffd-bf08-9a9a3dc153e3" containerName="dnsmasq-dns" Oct 13 14:28:35 crc kubenswrapper[4797]: I1013 14:28:35.990570 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbdbr" Oct 13 14:28:35 crc kubenswrapper[4797]: I1013 14:28:35.995423 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lbdbr"] Oct 13 14:28:36 crc kubenswrapper[4797]: I1013 14:28:36.050537 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhz9b\" (UniqueName: \"kubernetes.io/projected/00c0b6e0-17cc-43c2-b42b-706129ced2e3-kube-api-access-nhz9b\") pod \"neutron-db-create-lbdbr\" (UID: \"00c0b6e0-17cc-43c2-b42b-706129ced2e3\") " pod="openstack/neutron-db-create-lbdbr" Oct 13 14:28:36 crc kubenswrapper[4797]: I1013 14:28:36.152027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhz9b\" (UniqueName: \"kubernetes.io/projected/00c0b6e0-17cc-43c2-b42b-706129ced2e3-kube-api-access-nhz9b\") pod \"neutron-db-create-lbdbr\" (UID: \"00c0b6e0-17cc-43c2-b42b-706129ced2e3\") " pod="openstack/neutron-db-create-lbdbr" Oct 13 14:28:36 crc kubenswrapper[4797]: I1013 14:28:36.170368 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhz9b\" (UniqueName: \"kubernetes.io/projected/00c0b6e0-17cc-43c2-b42b-706129ced2e3-kube-api-access-nhz9b\") pod \"neutron-db-create-lbdbr\" (UID: \"00c0b6e0-17cc-43c2-b42b-706129ced2e3\") " pod="openstack/neutron-db-create-lbdbr" Oct 13 14:28:36 crc kubenswrapper[4797]: I1013 
14:28:36.309827 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbdbr" Oct 13 14:28:36 crc kubenswrapper[4797]: I1013 14:28:36.748486 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lbdbr"] Oct 13 14:28:37 crc kubenswrapper[4797]: I1013 14:28:37.537167 4797 generic.go:334] "Generic (PLEG): container finished" podID="00c0b6e0-17cc-43c2-b42b-706129ced2e3" containerID="41f4df9c2e5d1cf58537ea4c6f0e63a65af78328c7930e1c7058802628d7f7df" exitCode=0 Oct 13 14:28:37 crc kubenswrapper[4797]: I1013 14:28:37.537224 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbdbr" event={"ID":"00c0b6e0-17cc-43c2-b42b-706129ced2e3","Type":"ContainerDied","Data":"41f4df9c2e5d1cf58537ea4c6f0e63a65af78328c7930e1c7058802628d7f7df"} Oct 13 14:28:37 crc kubenswrapper[4797]: I1013 14:28:37.537254 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbdbr" event={"ID":"00c0b6e0-17cc-43c2-b42b-706129ced2e3","Type":"ContainerStarted","Data":"8098aa13dd19559202eadf88a3aad3ec2a925a8c8ceb6adfaf56a40370d92387"} Oct 13 14:28:38 crc kubenswrapper[4797]: I1013 14:28:38.876906 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lbdbr"
Oct 13 14:28:38 crc kubenswrapper[4797]: I1013 14:28:38.893927 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhz9b\" (UniqueName: \"kubernetes.io/projected/00c0b6e0-17cc-43c2-b42b-706129ced2e3-kube-api-access-nhz9b\") pod \"00c0b6e0-17cc-43c2-b42b-706129ced2e3\" (UID: \"00c0b6e0-17cc-43c2-b42b-706129ced2e3\") "
Oct 13 14:28:38 crc kubenswrapper[4797]: I1013 14:28:38.932519 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c0b6e0-17cc-43c2-b42b-706129ced2e3-kube-api-access-nhz9b" (OuterVolumeSpecName: "kube-api-access-nhz9b") pod "00c0b6e0-17cc-43c2-b42b-706129ced2e3" (UID: "00c0b6e0-17cc-43c2-b42b-706129ced2e3"). InnerVolumeSpecName "kube-api-access-nhz9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:28:38 crc kubenswrapper[4797]: I1013 14:28:38.995350 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhz9b\" (UniqueName: \"kubernetes.io/projected/00c0b6e0-17cc-43c2-b42b-706129ced2e3-kube-api-access-nhz9b\") on node \"crc\" DevicePath \"\""
Oct 13 14:28:39 crc kubenswrapper[4797]: I1013 14:28:39.559102 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbdbr" event={"ID":"00c0b6e0-17cc-43c2-b42b-706129ced2e3","Type":"ContainerDied","Data":"8098aa13dd19559202eadf88a3aad3ec2a925a8c8ceb6adfaf56a40370d92387"}
Oct 13 14:28:39 crc kubenswrapper[4797]: I1013 14:28:39.559179 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8098aa13dd19559202eadf88a3aad3ec2a925a8c8ceb6adfaf56a40370d92387"
Oct 13 14:28:39 crc kubenswrapper[4797]: I1013 14:28:39.559193 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbdbr"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.064561 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cf68-account-create-xv5hv"]
Oct 13 14:28:46 crc kubenswrapper[4797]: E1013 14:28:46.065743 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c0b6e0-17cc-43c2-b42b-706129ced2e3" containerName="mariadb-database-create"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.065772 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c0b6e0-17cc-43c2-b42b-706129ced2e3" containerName="mariadb-database-create"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.066091 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c0b6e0-17cc-43c2-b42b-706129ced2e3" containerName="mariadb-database-create"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.067066 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.071121 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.076148 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cf68-account-create-xv5hv"]
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.225092 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zp49\" (UniqueName: \"kubernetes.io/projected/b76b9c6a-0f53-48f0-9f69-13094dde56ca-kube-api-access-5zp49\") pod \"neutron-cf68-account-create-xv5hv\" (UID: \"b76b9c6a-0f53-48f0-9f69-13094dde56ca\") " pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.328275 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zp49\" (UniqueName: \"kubernetes.io/projected/b76b9c6a-0f53-48f0-9f69-13094dde56ca-kube-api-access-5zp49\") pod \"neutron-cf68-account-create-xv5hv\" (UID: \"b76b9c6a-0f53-48f0-9f69-13094dde56ca\") " pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.361363 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zp49\" (UniqueName: \"kubernetes.io/projected/b76b9c6a-0f53-48f0-9f69-13094dde56ca-kube-api-access-5zp49\") pod \"neutron-cf68-account-create-xv5hv\" (UID: \"b76b9c6a-0f53-48f0-9f69-13094dde56ca\") " pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.397593 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:46 crc kubenswrapper[4797]: I1013 14:28:46.881687 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cf68-account-create-xv5hv"]
Oct 13 14:28:46 crc kubenswrapper[4797]: W1013 14:28:46.888510 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76b9c6a_0f53_48f0_9f69_13094dde56ca.slice/crio-aa089858c9c8add930879b28dc4e52fc67680238f0e521376e0a64c1d200ddc5 WatchSource:0}: Error finding container aa089858c9c8add930879b28dc4e52fc67680238f0e521376e0a64c1d200ddc5: Status 404 returned error can't find the container with id aa089858c9c8add930879b28dc4e52fc67680238f0e521376e0a64c1d200ddc5
Oct 13 14:28:47 crc kubenswrapper[4797]: I1013 14:28:47.641232 4797 generic.go:334] "Generic (PLEG): container finished" podID="b76b9c6a-0f53-48f0-9f69-13094dde56ca" containerID="46b68267a8495306a9d6a730fd0bd31db302c64e244072d7aeed3466beb39955" exitCode=0
Oct 13 14:28:47 crc kubenswrapper[4797]: I1013 14:28:47.641363 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf68-account-create-xv5hv" event={"ID":"b76b9c6a-0f53-48f0-9f69-13094dde56ca","Type":"ContainerDied","Data":"46b68267a8495306a9d6a730fd0bd31db302c64e244072d7aeed3466beb39955"}
Oct 13 14:28:47 crc kubenswrapper[4797]: I1013 14:28:47.641787 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf68-account-create-xv5hv" event={"ID":"b76b9c6a-0f53-48f0-9f69-13094dde56ca","Type":"ContainerStarted","Data":"aa089858c9c8add930879b28dc4e52fc67680238f0e521376e0a64c1d200ddc5"}
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.037708 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.191697 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zp49\" (UniqueName: \"kubernetes.io/projected/b76b9c6a-0f53-48f0-9f69-13094dde56ca-kube-api-access-5zp49\") pod \"b76b9c6a-0f53-48f0-9f69-13094dde56ca\" (UID: \"b76b9c6a-0f53-48f0-9f69-13094dde56ca\") "
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.201055 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76b9c6a-0f53-48f0-9f69-13094dde56ca-kube-api-access-5zp49" (OuterVolumeSpecName: "kube-api-access-5zp49") pod "b76b9c6a-0f53-48f0-9f69-13094dde56ca" (UID: "b76b9c6a-0f53-48f0-9f69-13094dde56ca"). InnerVolumeSpecName "kube-api-access-5zp49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.293716 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zp49\" (UniqueName: \"kubernetes.io/projected/b76b9c6a-0f53-48f0-9f69-13094dde56ca-kube-api-access-5zp49\") on node \"crc\" DevicePath \"\""
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.668247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf68-account-create-xv5hv" event={"ID":"b76b9c6a-0f53-48f0-9f69-13094dde56ca","Type":"ContainerDied","Data":"aa089858c9c8add930879b28dc4e52fc67680238f0e521376e0a64c1d200ddc5"}
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.668310 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa089858c9c8add930879b28dc4e52fc67680238f0e521376e0a64c1d200ddc5"
Oct 13 14:28:49 crc kubenswrapper[4797]: I1013 14:28:49.668318 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf68-account-create-xv5hv"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.210330 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zhs42"]
Oct 13 14:28:51 crc kubenswrapper[4797]: E1013 14:28:51.212223 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76b9c6a-0f53-48f0-9f69-13094dde56ca" containerName="mariadb-account-create"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.212274 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76b9c6a-0f53-48f0-9f69-13094dde56ca" containerName="mariadb-account-create"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.212724 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76b9c6a-0f53-48f0-9f69-13094dde56ca" containerName="mariadb-account-create"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.214082 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.220097 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.220474 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.220841 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wdckq"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.223204 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zhs42"]
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.335601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-combined-ca-bundle\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.335785 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrr4\" (UniqueName: \"kubernetes.io/projected/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-kube-api-access-wfrr4\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.335878 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-config\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.437469 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-config\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.437874 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-combined-ca-bundle\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.438010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrr4\" (UniqueName: \"kubernetes.io/projected/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-kube-api-access-wfrr4\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.447779 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-config\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.449366 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-combined-ca-bundle\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.461263 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrr4\" (UniqueName: \"kubernetes.io/projected/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-kube-api-access-wfrr4\") pod \"neutron-db-sync-zhs42\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") " pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.573327 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:51 crc kubenswrapper[4797]: I1013 14:28:51.815762 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zhs42"]
Oct 13 14:28:52 crc kubenswrapper[4797]: I1013 14:28:52.703640 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zhs42" event={"ID":"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87","Type":"ContainerStarted","Data":"b27e61b23d28756876233b7d45b11cacfad8decbcb08eba0f3eaad2da91eaada"}
Oct 13 14:28:52 crc kubenswrapper[4797]: I1013 14:28:52.704254 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zhs42" event={"ID":"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87","Type":"ContainerStarted","Data":"4437b72b18ba4f0d3ddb0b0df5440bab5f34743f4085ea9b13b7bb0ec807ef35"}
Oct 13 14:28:52 crc kubenswrapper[4797]: I1013 14:28:52.727228 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zhs42" podStartSLOduration=1.727197915 podStartE2EDuration="1.727197915s" podCreationTimestamp="2025-10-13 14:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:28:52.725222146 +0000 UTC m=+4910.258772402" watchObservedRunningTime="2025-10-13 14:28:52.727197915 +0000 UTC m=+4910.260748241"
Oct 13 14:28:55 crc kubenswrapper[4797]: I1013 14:28:55.736980 4797 generic.go:334] "Generic (PLEG): container finished" podID="bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" containerID="b27e61b23d28756876233b7d45b11cacfad8decbcb08eba0f3eaad2da91eaada" exitCode=0
Oct 13 14:28:55 crc kubenswrapper[4797]: I1013 14:28:55.737083 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zhs42" event={"ID":"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87","Type":"ContainerDied","Data":"b27e61b23d28756876233b7d45b11cacfad8decbcb08eba0f3eaad2da91eaada"}
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.094140 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.250309 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfrr4\" (UniqueName: \"kubernetes.io/projected/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-kube-api-access-wfrr4\") pod \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") "
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.250558 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-config\") pod \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") "
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.250637 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-combined-ca-bundle\") pod \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\" (UID: \"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87\") "
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.274274 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-kube-api-access-wfrr4" (OuterVolumeSpecName: "kube-api-access-wfrr4") pod "bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" (UID: "bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87"). InnerVolumeSpecName "kube-api-access-wfrr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.281380 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-config" (OuterVolumeSpecName: "config") pod "bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" (UID: "bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.290218 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" (UID: "bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.354195 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-config\") on node \"crc\" DevicePath \"\""
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.355017 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.355111 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfrr4\" (UniqueName: \"kubernetes.io/projected/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87-kube-api-access-wfrr4\") on node \"crc\" DevicePath \"\""
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.765174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zhs42" event={"ID":"bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87","Type":"ContainerDied","Data":"4437b72b18ba4f0d3ddb0b0df5440bab5f34743f4085ea9b13b7bb0ec807ef35"}
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.765248 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4437b72b18ba4f0d3ddb0b0df5440bab5f34743f4085ea9b13b7bb0ec807ef35"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.765253 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zhs42"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.901607 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75d669c78f-wd2k9"]
Oct 13 14:28:57 crc kubenswrapper[4797]: E1013 14:28:57.902180 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" containerName="neutron-db-sync"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.902203 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" containerName="neutron-db-sync"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.902497 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" containerName="neutron-db-sync"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.903565 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.924023 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d669c78f-wd2k9"]
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.991790 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f5b654777-hpl7r"]
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.993912 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.998457 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wdckq"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.998649 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 13 14:28:57 crc kubenswrapper[4797]: I1013 14:28:57.998784 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.004716 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f5b654777-hpl7r"]
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.072360 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-dns-svc\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.072403 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-nb\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.072444 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-sb\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.072630 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2sv8\" (UniqueName: \"kubernetes.io/projected/1ee0989b-49de-46c5-a81f-4d855c2a9b47-kube-api-access-c2sv8\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.073004 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-config\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.174962 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-config\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-sb\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175065 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2sv8\" (UniqueName: \"kubernetes.io/projected/1ee0989b-49de-46c5-a81f-4d855c2a9b47-kube-api-access-c2sv8\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175090 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgj25\" (UniqueName: \"kubernetes.io/projected/14fda70e-da94-432b-8c32-8f290ca1ab52-kube-api-access-dgj25\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175134 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-combined-ca-bundle\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175179 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-httpd-config\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175362 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-config\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-dns-svc\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.175480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-nb\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.176834 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-dns-svc\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.176978 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-nb\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.177000 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-config\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.176974 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-sb\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.200605 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2sv8\" (UniqueName: \"kubernetes.io/projected/1ee0989b-49de-46c5-a81f-4d855c2a9b47-kube-api-access-c2sv8\") pod \"dnsmasq-dns-75d669c78f-wd2k9\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.233297 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.276860 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-httpd-config\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.276962 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-config\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.276995 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgj25\" (UniqueName: \"kubernetes.io/projected/14fda70e-da94-432b-8c32-8f290ca1ab52-kube-api-access-dgj25\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.277028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-combined-ca-bundle\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.280384 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-combined-ca-bundle\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.280954 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-config\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.283574 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14fda70e-da94-432b-8c32-8f290ca1ab52-httpd-config\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.304517 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgj25\" (UniqueName: \"kubernetes.io/projected/14fda70e-da94-432b-8c32-8f290ca1ab52-kube-api-access-dgj25\") pod \"neutron-7f5b654777-hpl7r\" (UID: \"14fda70e-da94-432b-8c32-8f290ca1ab52\") " pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.325513 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.712246 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d669c78f-wd2k9"]
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.777070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" event={"ID":"1ee0989b-49de-46c5-a81f-4d855c2a9b47","Type":"ContainerStarted","Data":"ec3401a9dda805c59e3a5bdbe380cc3a74c943e0ca9056197ee4758d392e6780"}
Oct 13 14:28:58 crc kubenswrapper[4797]: I1013 14:28:58.933627 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f5b654777-hpl7r"]
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.787019 4797 generic.go:334] "Generic (PLEG): container finished" podID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerID="1d81bec1b7b7309619f89efc40364eb3572fdd36037e8b80efa117c69603d044" exitCode=0
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.787375 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" event={"ID":"1ee0989b-49de-46c5-a81f-4d855c2a9b47","Type":"ContainerDied","Data":"1d81bec1b7b7309619f89efc40364eb3572fdd36037e8b80efa117c69603d044"}
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.791181 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5b654777-hpl7r" event={"ID":"14fda70e-da94-432b-8c32-8f290ca1ab52","Type":"ContainerStarted","Data":"ee853752985e404e0d2f0443c8d41510c00fb5e8574fddd7ff5f7654d64c62d5"}
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.791295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5b654777-hpl7r" event={"ID":"14fda70e-da94-432b-8c32-8f290ca1ab52","Type":"ContainerStarted","Data":"ca2b411cf10b4b00658603242f01fd8190fd88ad70f372c3af220397cf773fb3"}
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.791357 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5b654777-hpl7r" event={"ID":"14fda70e-da94-432b-8c32-8f290ca1ab52","Type":"ContainerStarted","Data":"6e894a0e3442bcf088fd132371a1ca8582698abb10c4154dd1fe1244814c3a4f"}
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.791881 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f5b654777-hpl7r"
Oct 13 14:28:59 crc kubenswrapper[4797]: I1013 14:28:59.831081 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f5b654777-hpl7r" podStartSLOduration=2.831062858 podStartE2EDuration="2.831062858s" podCreationTimestamp="2025-10-13 14:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:28:59.82539856 +0000 UTC m=+4917.358948836" watchObservedRunningTime="2025-10-13 14:28:59.831062858 +0000 UTC m=+4917.364613114"
Oct 13 14:29:00 crc kubenswrapper[4797]: I1013 14:29:00.801358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" event={"ID":"1ee0989b-49de-46c5-a81f-4d855c2a9b47","Type":"ContainerStarted","Data":"74f001ad8e6d132da9c293330fc1ae0c8f91a5f9ce7bd679d3cf08b907e1c7e8"}
Oct 13 14:29:00 crc kubenswrapper[4797]: I1013 14:29:00.818032 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" podStartSLOduration=3.818013516 podStartE2EDuration="3.818013516s" podCreationTimestamp="2025-10-13 14:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:29:00.815418183 +0000 UTC m=+4918.348968439" watchObservedRunningTime="2025-10-13 14:29:00.818013516 +0000 UTC m=+4918.351563772"
Oct 13 14:29:01 crc kubenswrapper[4797]: I1013 14:29:01.818951 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.235395 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9"
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.292442 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbdf8b99-5z4rr"]
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.295099 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerName="dnsmasq-dns" containerID="cri-o://cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb" gracePeriod=10
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.821694 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr"
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.873327 4797 generic.go:334] "Generic (PLEG): container finished" podID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerID="cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb" exitCode=0
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.873386 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" event={"ID":"80b0a71d-5c0a-4432-a2cd-15010490e327","Type":"ContainerDied","Data":"cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb"}
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.873453 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" event={"ID":"80b0a71d-5c0a-4432-a2cd-15010490e327","Type":"ContainerDied","Data":"16570f060bc1832516b7f9a0d92e68367712d29f7a16cb47766897a0033fec60"}
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.873471 4797 scope.go:117] "RemoveContainer" containerID="cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb"
Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.873404 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbdf8b99-5z4rr" Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.903841 4797 scope.go:117] "RemoveContainer" containerID="cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa" Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.925235 4797 scope.go:117] "RemoveContainer" containerID="cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb" Oct 13 14:29:08 crc kubenswrapper[4797]: E1013 14:29:08.925649 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb\": container with ID starting with cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb not found: ID does not exist" containerID="cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb" Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.925697 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb"} err="failed to get container status \"cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb\": rpc error: code = NotFound desc = could not find container \"cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb\": container with ID starting with cf16f52dc0826a0d662836d2d29eaba40fa292024955274002ae4d7f12a2f6bb not found: ID does not exist" Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.925731 4797 scope.go:117] "RemoveContainer" containerID="cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa" Oct 13 14:29:08 crc kubenswrapper[4797]: E1013 14:29:08.926113 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa\": container with ID starting with cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa not found: ID does not exist" containerID="cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa" Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.926149 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa"} err="failed to get container status \"cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa\": rpc error: code = NotFound desc = could not find container \"cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa\": container with ID starting with cfb48ca87f6adfcc8962943ea85f73a69463c6e0767d1ffd858bcb871d7f4daa not found: ID does not exist" Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.955499 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-nb\") pod \"80b0a71d-5c0a-4432-a2cd-15010490e327\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.955583 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-config\") pod \"80b0a71d-5c0a-4432-a2cd-15010490e327\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.955619 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdpdm\" (UniqueName: \"kubernetes.io/projected/80b0a71d-5c0a-4432-a2cd-15010490e327-kube-api-access-kdpdm\") pod \"80b0a71d-5c0a-4432-a2cd-15010490e327\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.955703 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-sb\") pod \"80b0a71d-5c0a-4432-a2cd-15010490e327\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.955723 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-dns-svc\") pod \"80b0a71d-5c0a-4432-a2cd-15010490e327\" (UID: \"80b0a71d-5c0a-4432-a2cd-15010490e327\") " Oct 13 14:29:08 crc kubenswrapper[4797]: I1013 14:29:08.963569 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b0a71d-5c0a-4432-a2cd-15010490e327-kube-api-access-kdpdm" (OuterVolumeSpecName: "kube-api-access-kdpdm") pod "80b0a71d-5c0a-4432-a2cd-15010490e327" (UID: "80b0a71d-5c0a-4432-a2cd-15010490e327"). InnerVolumeSpecName "kube-api-access-kdpdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.000571 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "80b0a71d-5c0a-4432-a2cd-15010490e327" (UID: "80b0a71d-5c0a-4432-a2cd-15010490e327"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.001527 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-config" (OuterVolumeSpecName: "config") pod "80b0a71d-5c0a-4432-a2cd-15010490e327" (UID: "80b0a71d-5c0a-4432-a2cd-15010490e327"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.004258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80b0a71d-5c0a-4432-a2cd-15010490e327" (UID: "80b0a71d-5c0a-4432-a2cd-15010490e327"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.011888 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "80b0a71d-5c0a-4432-a2cd-15010490e327" (UID: "80b0a71d-5c0a-4432-a2cd-15010490e327"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.057409 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.057440 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdpdm\" (UniqueName: \"kubernetes.io/projected/80b0a71d-5c0a-4432-a2cd-15010490e327-kube-api-access-kdpdm\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.057449 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.057458 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:09 crc 
kubenswrapper[4797]: I1013 14:29:09.057468 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80b0a71d-5c0a-4432-a2cd-15010490e327-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.216565 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbdf8b99-5z4rr"] Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.223080 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bbdf8b99-5z4rr"] Oct 13 14:29:09 crc kubenswrapper[4797]: I1013 14:29:09.247548 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" path="/var/lib/kubelet/pods/80b0a71d-5c0a-4432-a2cd-15010490e327/volumes" Oct 13 14:29:09 crc kubenswrapper[4797]: E1013 14:29:09.374477 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80b0a71d_5c0a_4432_a2cd_15010490e327.slice/crio-16570f060bc1832516b7f9a0d92e68367712d29f7a16cb47766897a0033fec60\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80b0a71d_5c0a_4432_a2cd_15010490e327.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:29:18 crc kubenswrapper[4797]: I1013 14:29:18.120681 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:29:18 crc kubenswrapper[4797]: I1013 14:29:18.121260 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:29:28 crc kubenswrapper[4797]: I1013 14:29:28.334382 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f5b654777-hpl7r" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.091628 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nq9f5"] Oct 13 14:29:29 crc kubenswrapper[4797]: E1013 14:29:29.092394 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerName="dnsmasq-dns" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.092414 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerName="dnsmasq-dns" Oct 13 14:29:29 crc kubenswrapper[4797]: E1013 14:29:29.092430 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerName="init" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.092439 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerName="init" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.092609 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b0a71d-5c0a-4432-a2cd-15010490e327" containerName="dnsmasq-dns" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.093751 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.109382 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nq9f5"] Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.252050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-catalog-content\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.252119 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-utilities\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.252190 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrvcm\" (UniqueName: \"kubernetes.io/projected/691453d2-1233-457f-9402-2bcb9005936a-kube-api-access-zrvcm\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.290745 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2l2vd"] Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.292359 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.345049 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l2vd"] Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.368766 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-utilities\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.369379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrvcm\" (UniqueName: \"kubernetes.io/projected/691453d2-1233-457f-9402-2bcb9005936a-kube-api-access-zrvcm\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.369916 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-catalog-content\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.372059 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-catalog-content\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.373034 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-utilities\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.392727 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrvcm\" (UniqueName: \"kubernetes.io/projected/691453d2-1233-457f-9402-2bcb9005936a-kube-api-access-zrvcm\") pod \"community-operators-nq9f5\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.438787 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.472097 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-utilities\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.472175 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-catalog-content\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.472244 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8pp\" (UniqueName: \"kubernetes.io/projected/152284ac-c39a-4c58-a164-6caa70cde371-kube-api-access-9m8pp\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") 
" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.573724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-utilities\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.573795 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-catalog-content\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.573868 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8pp\" (UniqueName: \"kubernetes.io/projected/152284ac-c39a-4c58-a164-6caa70cde371-kube-api-access-9m8pp\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.574858 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-utilities\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.575065 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-catalog-content\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " 
pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.596429 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8pp\" (UniqueName: \"kubernetes.io/projected/152284ac-c39a-4c58-a164-6caa70cde371-kube-api-access-9m8pp\") pod \"certified-operators-2l2vd\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:29 crc kubenswrapper[4797]: I1013 14:29:29.649177 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:30 crc kubenswrapper[4797]: I1013 14:29:30.027758 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l2vd"] Oct 13 14:29:30 crc kubenswrapper[4797]: I1013 14:29:30.066902 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nq9f5"] Oct 13 14:29:30 crc kubenswrapper[4797]: I1013 14:29:30.083070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerStarted","Data":"e3d3aeef97aece0bbe89e1440ec5ff0f0c5d753f7bdfdb5293d7841a6c9e8dec"} Oct 13 14:29:31 crc kubenswrapper[4797]: I1013 14:29:31.091924 4797 generic.go:334] "Generic (PLEG): container finished" podID="691453d2-1233-457f-9402-2bcb9005936a" containerID="872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e" exitCode=0 Oct 13 14:29:31 crc kubenswrapper[4797]: I1013 14:29:31.092071 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nq9f5" event={"ID":"691453d2-1233-457f-9402-2bcb9005936a","Type":"ContainerDied","Data":"872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e"} Oct 13 14:29:31 crc kubenswrapper[4797]: I1013 14:29:31.092126 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-nq9f5" event={"ID":"691453d2-1233-457f-9402-2bcb9005936a","Type":"ContainerStarted","Data":"97eff4276d2f018cdcb7f85d219c08278faff3e179ef4e173a9e31690b4f09c4"} Oct 13 14:29:31 crc kubenswrapper[4797]: I1013 14:29:31.094443 4797 generic.go:334] "Generic (PLEG): container finished" podID="152284ac-c39a-4c58-a164-6caa70cde371" containerID="8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386" exitCode=0 Oct 13 14:29:31 crc kubenswrapper[4797]: I1013 14:29:31.094499 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerDied","Data":"8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386"} Oct 13 14:29:32 crc kubenswrapper[4797]: I1013 14:29:32.108452 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerStarted","Data":"610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc"} Oct 13 14:29:33 crc kubenswrapper[4797]: I1013 14:29:33.122557 4797 generic.go:334] "Generic (PLEG): container finished" podID="691453d2-1233-457f-9402-2bcb9005936a" containerID="4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d" exitCode=0 Oct 13 14:29:33 crc kubenswrapper[4797]: I1013 14:29:33.122658 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nq9f5" event={"ID":"691453d2-1233-457f-9402-2bcb9005936a","Type":"ContainerDied","Data":"4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d"} Oct 13 14:29:33 crc kubenswrapper[4797]: I1013 14:29:33.126019 4797 generic.go:334] "Generic (PLEG): container finished" podID="152284ac-c39a-4c58-a164-6caa70cde371" containerID="610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc" exitCode=0 Oct 13 14:29:33 crc kubenswrapper[4797]: 
I1013 14:29:33.126073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerDied","Data":"610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc"} Oct 13 14:29:34 crc kubenswrapper[4797]: I1013 14:29:34.134839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nq9f5" event={"ID":"691453d2-1233-457f-9402-2bcb9005936a","Type":"ContainerStarted","Data":"fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6"} Oct 13 14:29:34 crc kubenswrapper[4797]: I1013 14:29:34.137154 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerStarted","Data":"59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366"} Oct 13 14:29:34 crc kubenswrapper[4797]: I1013 14:29:34.154301 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nq9f5" podStartSLOduration=2.436921559 podStartE2EDuration="5.154281731s" podCreationTimestamp="2025-10-13 14:29:29 +0000 UTC" firstStartedPulling="2025-10-13 14:29:31.094131575 +0000 UTC m=+4948.627681831" lastFinishedPulling="2025-10-13 14:29:33.811491747 +0000 UTC m=+4951.345042003" observedRunningTime="2025-10-13 14:29:34.148488569 +0000 UTC m=+4951.682038845" watchObservedRunningTime="2025-10-13 14:29:34.154281731 +0000 UTC m=+4951.687831987" Oct 13 14:29:34 crc kubenswrapper[4797]: I1013 14:29:34.175114 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2l2vd" podStartSLOduration=2.722077242 podStartE2EDuration="5.17509821s" podCreationTimestamp="2025-10-13 14:29:29 +0000 UTC" firstStartedPulling="2025-10-13 14:29:31.09636812 +0000 UTC m=+4948.629918376" lastFinishedPulling="2025-10-13 
14:29:33.549389088 +0000 UTC m=+4951.082939344" observedRunningTime="2025-10-13 14:29:34.172659321 +0000 UTC m=+4951.706209587" watchObservedRunningTime="2025-10-13 14:29:34.17509821 +0000 UTC m=+4951.708648466" Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.395859 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gw9tq"] Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.397255 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.405996 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gw9tq"] Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.522916 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt7k6\" (UniqueName: \"kubernetes.io/projected/669368b9-50a6-46f6-b526-26be88b3c854-kube-api-access-lt7k6\") pod \"glance-db-create-gw9tq\" (UID: \"669368b9-50a6-46f6-b526-26be88b3c854\") " pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.624453 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt7k6\" (UniqueName: \"kubernetes.io/projected/669368b9-50a6-46f6-b526-26be88b3c854-kube-api-access-lt7k6\") pod \"glance-db-create-gw9tq\" (UID: \"669368b9-50a6-46f6-b526-26be88b3c854\") " pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.649839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt7k6\" (UniqueName: \"kubernetes.io/projected/669368b9-50a6-46f6-b526-26be88b3c854-kube-api-access-lt7k6\") pod \"glance-db-create-gw9tq\" (UID: \"669368b9-50a6-46f6-b526-26be88b3c854\") " pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:36 crc kubenswrapper[4797]: I1013 14:29:36.718595 4797 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:37 crc kubenswrapper[4797]: I1013 14:29:37.202884 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gw9tq"] Oct 13 14:29:37 crc kubenswrapper[4797]: W1013 14:29:37.210957 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod669368b9_50a6_46f6_b526_26be88b3c854.slice/crio-33b34192c058f1d7693a7f2cd68200fc81eda4e4c911cd68ec665803ff902e15 WatchSource:0}: Error finding container 33b34192c058f1d7693a7f2cd68200fc81eda4e4c911cd68ec665803ff902e15: Status 404 returned error can't find the container with id 33b34192c058f1d7693a7f2cd68200fc81eda4e4c911cd68ec665803ff902e15 Oct 13 14:29:38 crc kubenswrapper[4797]: I1013 14:29:38.177051 4797 generic.go:334] "Generic (PLEG): container finished" podID="669368b9-50a6-46f6-b526-26be88b3c854" containerID="1301c51c48e78ec2f0067fb09bb6b31345679c53adfafb2eb36162c4ed150680" exitCode=0 Oct 13 14:29:38 crc kubenswrapper[4797]: I1013 14:29:38.177178 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gw9tq" event={"ID":"669368b9-50a6-46f6-b526-26be88b3c854","Type":"ContainerDied","Data":"1301c51c48e78ec2f0067fb09bb6b31345679c53adfafb2eb36162c4ed150680"} Oct 13 14:29:38 crc kubenswrapper[4797]: I1013 14:29:38.177521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gw9tq" event={"ID":"669368b9-50a6-46f6-b526-26be88b3c854","Type":"ContainerStarted","Data":"33b34192c058f1d7693a7f2cd68200fc81eda4e4c911cd68ec665803ff902e15"} Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.439006 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.439424 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.513760 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.591351 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.651127 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.651175 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.687456 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt7k6\" (UniqueName: \"kubernetes.io/projected/669368b9-50a6-46f6-b526-26be88b3c854-kube-api-access-lt7k6\") pod \"669368b9-50a6-46f6-b526-26be88b3c854\" (UID: \"669368b9-50a6-46f6-b526-26be88b3c854\") " Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.692590 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669368b9-50a6-46f6-b526-26be88b3c854-kube-api-access-lt7k6" (OuterVolumeSpecName: "kube-api-access-lt7k6") pod "669368b9-50a6-46f6-b526-26be88b3c854" (UID: "669368b9-50a6-46f6-b526-26be88b3c854"). InnerVolumeSpecName "kube-api-access-lt7k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.699409 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:39 crc kubenswrapper[4797]: I1013 14:29:39.789798 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt7k6\" (UniqueName: \"kubernetes.io/projected/669368b9-50a6-46f6-b526-26be88b3c854-kube-api-access-lt7k6\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:40 crc kubenswrapper[4797]: I1013 14:29:40.206282 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gw9tq" event={"ID":"669368b9-50a6-46f6-b526-26be88b3c854","Type":"ContainerDied","Data":"33b34192c058f1d7693a7f2cd68200fc81eda4e4c911cd68ec665803ff902e15"} Oct 13 14:29:40 crc kubenswrapper[4797]: I1013 14:29:40.206361 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b34192c058f1d7693a7f2cd68200fc81eda4e4c911cd68ec665803ff902e15" Oct 13 14:29:40 crc kubenswrapper[4797]: I1013 14:29:40.206325 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gw9tq" Oct 13 14:29:40 crc kubenswrapper[4797]: I1013 14:29:40.254129 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:40 crc kubenswrapper[4797]: I1013 14:29:40.270893 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:41 crc kubenswrapper[4797]: I1013 14:29:41.284306 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nq9f5"] Oct 13 14:29:42 crc kubenswrapper[4797]: I1013 14:29:42.227001 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nq9f5" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="registry-server" containerID="cri-o://fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6" gracePeriod=2 Oct 13 14:29:42 crc kubenswrapper[4797]: I1013 14:29:42.691900 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l2vd"] Oct 13 14:29:42 crc kubenswrapper[4797]: I1013 14:29:42.692477 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2l2vd" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="registry-server" containerID="cri-o://59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366" gracePeriod=2 Oct 13 14:29:42 crc kubenswrapper[4797]: I1013 14:29:42.871520 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.042368 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrvcm\" (UniqueName: \"kubernetes.io/projected/691453d2-1233-457f-9402-2bcb9005936a-kube-api-access-zrvcm\") pod \"691453d2-1233-457f-9402-2bcb9005936a\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.042489 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-catalog-content\") pod \"691453d2-1233-457f-9402-2bcb9005936a\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.047685 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691453d2-1233-457f-9402-2bcb9005936a-kube-api-access-zrvcm" (OuterVolumeSpecName: "kube-api-access-zrvcm") pod "691453d2-1233-457f-9402-2bcb9005936a" (UID: "691453d2-1233-457f-9402-2bcb9005936a"). InnerVolumeSpecName "kube-api-access-zrvcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.051046 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-utilities\") pod \"691453d2-1233-457f-9402-2bcb9005936a\" (UID: \"691453d2-1233-457f-9402-2bcb9005936a\") " Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.051780 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-utilities" (OuterVolumeSpecName: "utilities") pod "691453d2-1233-457f-9402-2bcb9005936a" (UID: "691453d2-1233-457f-9402-2bcb9005936a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.051874 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrvcm\" (UniqueName: \"kubernetes.io/projected/691453d2-1233-457f-9402-2bcb9005936a-kube-api-access-zrvcm\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.094350 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.100395 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "691453d2-1233-457f-9402-2bcb9005936a" (UID: "691453d2-1233-457f-9402-2bcb9005936a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.153058 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.153084 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691453d2-1233-457f-9402-2bcb9005936a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.247363 4797 generic.go:334] "Generic (PLEG): container finished" podID="691453d2-1233-457f-9402-2bcb9005936a" containerID="fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6" exitCode=0 Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.247479 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nq9f5" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.250274 4797 generic.go:334] "Generic (PLEG): container finished" podID="152284ac-c39a-4c58-a164-6caa70cde371" containerID="59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366" exitCode=0 Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.250353 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l2vd" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.253584 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-utilities\") pod \"152284ac-c39a-4c58-a164-6caa70cde371\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.253658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-catalog-content\") pod \"152284ac-c39a-4c58-a164-6caa70cde371\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.253710 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m8pp\" (UniqueName: \"kubernetes.io/projected/152284ac-c39a-4c58-a164-6caa70cde371-kube-api-access-9m8pp\") pod \"152284ac-c39a-4c58-a164-6caa70cde371\" (UID: \"152284ac-c39a-4c58-a164-6caa70cde371\") " Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.255274 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nq9f5" event={"ID":"691453d2-1233-457f-9402-2bcb9005936a","Type":"ContainerDied","Data":"fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6"} Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.255412 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nq9f5" event={"ID":"691453d2-1233-457f-9402-2bcb9005936a","Type":"ContainerDied","Data":"97eff4276d2f018cdcb7f85d219c08278faff3e179ef4e173a9e31690b4f09c4"} Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.255508 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerDied","Data":"59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366"} Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.255617 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l2vd" event={"ID":"152284ac-c39a-4c58-a164-6caa70cde371","Type":"ContainerDied","Data":"e3d3aeef97aece0bbe89e1440ec5ff0f0c5d753f7bdfdb5293d7841a6c9e8dec"} Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.255444 4797 scope.go:117] "RemoveContainer" containerID="fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.254913 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-utilities" (OuterVolumeSpecName: "utilities") pod "152284ac-c39a-4c58-a164-6caa70cde371" (UID: "152284ac-c39a-4c58-a164-6caa70cde371"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.257442 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152284ac-c39a-4c58-a164-6caa70cde371-kube-api-access-9m8pp" (OuterVolumeSpecName: "kube-api-access-9m8pp") pod "152284ac-c39a-4c58-a164-6caa70cde371" (UID: "152284ac-c39a-4c58-a164-6caa70cde371"). InnerVolumeSpecName "kube-api-access-9m8pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.277276 4797 scope.go:117] "RemoveContainer" containerID="4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.289549 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "152284ac-c39a-4c58-a164-6caa70cde371" (UID: "152284ac-c39a-4c58-a164-6caa70cde371"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.295255 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nq9f5"] Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.300848 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nq9f5"] Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.355741 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.355788 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152284ac-c39a-4c58-a164-6caa70cde371-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.355840 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m8pp\" (UniqueName: \"kubernetes.io/projected/152284ac-c39a-4c58-a164-6caa70cde371-kube-api-access-9m8pp\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.547733 4797 scope.go:117] "RemoveContainer" 
containerID="872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.613899 4797 scope.go:117] "RemoveContainer" containerID="fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6" Oct 13 14:29:43 crc kubenswrapper[4797]: E1013 14:29:43.614299 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6\": container with ID starting with fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6 not found: ID does not exist" containerID="fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.614341 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6"} err="failed to get container status \"fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6\": rpc error: code = NotFound desc = could not find container \"fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6\": container with ID starting with fc70a2bc61a047febc4079937772a2af41929f8e158008e1ccb4b4830f50e0a6 not found: ID does not exist" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.614364 4797 scope.go:117] "RemoveContainer" containerID="4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d" Oct 13 14:29:43 crc kubenswrapper[4797]: E1013 14:29:43.614611 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d\": container with ID starting with 4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d not found: ID does not exist" containerID="4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d" Oct 13 14:29:43 crc 
kubenswrapper[4797]: I1013 14:29:43.614640 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d"} err="failed to get container status \"4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d\": rpc error: code = NotFound desc = could not find container \"4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d\": container with ID starting with 4c2f2da5912ad856aa67aa235db6fec9e282b292e15bad701acf7179aa99166d not found: ID does not exist" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.614660 4797 scope.go:117] "RemoveContainer" containerID="872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e" Oct 13 14:29:43 crc kubenswrapper[4797]: E1013 14:29:43.615012 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e\": container with ID starting with 872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e not found: ID does not exist" containerID="872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.615037 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e"} err="failed to get container status \"872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e\": rpc error: code = NotFound desc = could not find container \"872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e\": container with ID starting with 872939c53cac82ebccc6a807f277c318d5f6ec1ba823f7587af0b27cb7a0301e not found: ID does not exist" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.615052 4797 scope.go:117] "RemoveContainer" containerID="59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366" Oct 13 
14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.640131 4797 scope.go:117] "RemoveContainer" containerID="610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.647848 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l2vd"] Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.654523 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2l2vd"] Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.670581 4797 scope.go:117] "RemoveContainer" containerID="8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.704133 4797 scope.go:117] "RemoveContainer" containerID="59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366" Oct 13 14:29:43 crc kubenswrapper[4797]: E1013 14:29:43.705017 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366\": container with ID starting with 59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366 not found: ID does not exist" containerID="59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.705047 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366"} err="failed to get container status \"59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366\": rpc error: code = NotFound desc = could not find container \"59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366\": container with ID starting with 59b52a62e0d39e6021b06a893342ba9124a5478fe0cc700cbd9e7c8157d7f366 not found: ID does not exist" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.705071 
4797 scope.go:117] "RemoveContainer" containerID="610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc" Oct 13 14:29:43 crc kubenswrapper[4797]: E1013 14:29:43.705308 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc\": container with ID starting with 610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc not found: ID does not exist" containerID="610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.705336 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc"} err="failed to get container status \"610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc\": rpc error: code = NotFound desc = could not find container \"610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc\": container with ID starting with 610d1a3fa30eb107e9729f5023ca0cfc74e3401346d65322bfffb88d527884fc not found: ID does not exist" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.705354 4797 scope.go:117] "RemoveContainer" containerID="8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386" Oct 13 14:29:43 crc kubenswrapper[4797]: E1013 14:29:43.705626 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386\": container with ID starting with 8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386 not found: ID does not exist" containerID="8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386" Oct 13 14:29:43 crc kubenswrapper[4797]: I1013 14:29:43.705652 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386"} err="failed to get container status \"8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386\": rpc error: code = NotFound desc = could not find container \"8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386\": container with ID starting with 8cb319b34a72f394542fa54d681170110d40bdfec44823b9ce695d0bad1de386 not found: ID does not exist" Oct 13 14:29:45 crc kubenswrapper[4797]: I1013 14:29:45.256832 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152284ac-c39a-4c58-a164-6caa70cde371" path="/var/lib/kubelet/pods/152284ac-c39a-4c58-a164-6caa70cde371/volumes" Oct 13 14:29:45 crc kubenswrapper[4797]: I1013 14:29:45.277240 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691453d2-1233-457f-9402-2bcb9005936a" path="/var/lib/kubelet/pods/691453d2-1233-457f-9402-2bcb9005936a/volumes" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.476932 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8a64-account-create-5g7vx"] Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477516 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="registry-server" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477540 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="registry-server" Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477570 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="extract-content" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477585 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="extract-content" Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477644 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="extract-content" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477659 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="extract-content" Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477676 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="extract-utilities" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477690 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="extract-utilities" Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477737 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="extract-utilities" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477750 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="extract-utilities" Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477786 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669368b9-50a6-46f6-b526-26be88b3c854" containerName="mariadb-database-create" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477799 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="669368b9-50a6-46f6-b526-26be88b3c854" containerName="mariadb-database-create" Oct 13 14:29:46 crc kubenswrapper[4797]: E1013 14:29:46.477845 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="registry-server" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.477859 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="registry-server" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 
14:29:46.478171 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="691453d2-1233-457f-9402-2bcb9005936a" containerName="registry-server" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.478213 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="152284ac-c39a-4c58-a164-6caa70cde371" containerName="registry-server" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.478243 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="669368b9-50a6-46f6-b526-26be88b3c854" containerName="mariadb-database-create" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.479220 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.482532 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.486394 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8a64-account-create-5g7vx"] Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.625451 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9xnx\" (UniqueName: \"kubernetes.io/projected/99e5c216-96da-4a27-b0e2-26973c459a03-kube-api-access-x9xnx\") pod \"glance-8a64-account-create-5g7vx\" (UID: \"99e5c216-96da-4a27-b0e2-26973c459a03\") " pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.727321 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9xnx\" (UniqueName: \"kubernetes.io/projected/99e5c216-96da-4a27-b0e2-26973c459a03-kube-api-access-x9xnx\") pod \"glance-8a64-account-create-5g7vx\" (UID: \"99e5c216-96da-4a27-b0e2-26973c459a03\") " pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.760972 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9xnx\" (UniqueName: \"kubernetes.io/projected/99e5c216-96da-4a27-b0e2-26973c459a03-kube-api-access-x9xnx\") pod \"glance-8a64-account-create-5g7vx\" (UID: \"99e5c216-96da-4a27-b0e2-26973c459a03\") " pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:46 crc kubenswrapper[4797]: I1013 14:29:46.803682 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:47 crc kubenswrapper[4797]: I1013 14:29:47.317499 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8a64-account-create-5g7vx"] Oct 13 14:29:48 crc kubenswrapper[4797]: I1013 14:29:48.120428 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:29:48 crc kubenswrapper[4797]: I1013 14:29:48.120515 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:29:48 crc kubenswrapper[4797]: I1013 14:29:48.327764 4797 generic.go:334] "Generic (PLEG): container finished" podID="99e5c216-96da-4a27-b0e2-26973c459a03" containerID="d1b82afce4cb561251ab198fd7c2cec35b900e5a56031f7eeba8e0e75efd2f34" exitCode=0 Oct 13 14:29:48 crc kubenswrapper[4797]: I1013 14:29:48.327880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8a64-account-create-5g7vx" 
event={"ID":"99e5c216-96da-4a27-b0e2-26973c459a03","Type":"ContainerDied","Data":"d1b82afce4cb561251ab198fd7c2cec35b900e5a56031f7eeba8e0e75efd2f34"} Oct 13 14:29:48 crc kubenswrapper[4797]: I1013 14:29:48.327925 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8a64-account-create-5g7vx" event={"ID":"99e5c216-96da-4a27-b0e2-26973c459a03","Type":"ContainerStarted","Data":"22910f9384c9c9bb88886632189c889d73fa1dc22379b5826ca46deec73ca17d"} Oct 13 14:29:49 crc kubenswrapper[4797]: I1013 14:29:49.720953 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:49 crc kubenswrapper[4797]: I1013 14:29:49.890942 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9xnx\" (UniqueName: \"kubernetes.io/projected/99e5c216-96da-4a27-b0e2-26973c459a03-kube-api-access-x9xnx\") pod \"99e5c216-96da-4a27-b0e2-26973c459a03\" (UID: \"99e5c216-96da-4a27-b0e2-26973c459a03\") " Oct 13 14:29:49 crc kubenswrapper[4797]: I1013 14:29:49.898351 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e5c216-96da-4a27-b0e2-26973c459a03-kube-api-access-x9xnx" (OuterVolumeSpecName: "kube-api-access-x9xnx") pod "99e5c216-96da-4a27-b0e2-26973c459a03" (UID: "99e5c216-96da-4a27-b0e2-26973c459a03"). InnerVolumeSpecName "kube-api-access-x9xnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:29:49 crc kubenswrapper[4797]: I1013 14:29:49.993720 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9xnx\" (UniqueName: \"kubernetes.io/projected/99e5c216-96da-4a27-b0e2-26973c459a03-kube-api-access-x9xnx\") on node \"crc\" DevicePath \"\"" Oct 13 14:29:50 crc kubenswrapper[4797]: I1013 14:29:50.349633 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8a64-account-create-5g7vx" event={"ID":"99e5c216-96da-4a27-b0e2-26973c459a03","Type":"ContainerDied","Data":"22910f9384c9c9bb88886632189c889d73fa1dc22379b5826ca46deec73ca17d"} Oct 13 14:29:50 crc kubenswrapper[4797]: I1013 14:29:50.349689 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22910f9384c9c9bb88886632189c889d73fa1dc22379b5826ca46deec73ca17d" Oct 13 14:29:50 crc kubenswrapper[4797]: I1013 14:29:50.349720 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8a64-account-create-5g7vx" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.646633 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s6mjj"] Oct 13 14:29:51 crc kubenswrapper[4797]: E1013 14:29:51.647385 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5c216-96da-4a27-b0e2-26973c459a03" containerName="mariadb-account-create" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.647402 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5c216-96da-4a27-b0e2-26973c459a03" containerName="mariadb-account-create" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.647650 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e5c216-96da-4a27-b0e2-26973c459a03" containerName="mariadb-account-create" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.648374 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.651122 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.652049 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tgh5n" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.664194 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s6mjj"] Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.727613 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-config-data\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.727656 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-db-sync-config-data\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.727687 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-combined-ca-bundle\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.727835 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dptn\" (UniqueName: 
\"kubernetes.io/projected/54342a44-f051-411b-8a7c-fabc11ebe1dc-kube-api-access-5dptn\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.829394 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-config-data\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.830420 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-db-sync-config-data\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.830452 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-combined-ca-bundle\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.830517 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dptn\" (UniqueName: \"kubernetes.io/projected/54342a44-f051-411b-8a7c-fabc11ebe1dc-kube-api-access-5dptn\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.835849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-db-sync-config-data\") pod \"glance-db-sync-s6mjj\" 
(UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.835941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-config-data\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.836446 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-combined-ca-bundle\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.850923 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dptn\" (UniqueName: \"kubernetes.io/projected/54342a44-f051-411b-8a7c-fabc11ebe1dc-kube-api-access-5dptn\") pod \"glance-db-sync-s6mjj\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:51 crc kubenswrapper[4797]: I1013 14:29:51.982867 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s6mjj" Oct 13 14:29:52 crc kubenswrapper[4797]: I1013 14:29:52.608917 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s6mjj"] Oct 13 14:29:53 crc kubenswrapper[4797]: I1013 14:29:53.378915 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6mjj" event={"ID":"54342a44-f051-411b-8a7c-fabc11ebe1dc","Type":"ContainerStarted","Data":"ab0381abc5339ce7442416457d0e83f03b049ee713f833020939912d2531a63e"} Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.151182 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v"] Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.154241 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.161579 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.161895 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.169444 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v"] Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.286481 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bd21a4-836f-4c46-a04a-6a5f262004a7-config-volume\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc 
kubenswrapper[4797]: I1013 14:30:00.286571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7bd21a4-836f-4c46-a04a-6a5f262004a7-secret-volume\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.286625 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kh2s\" (UniqueName: \"kubernetes.io/projected/c7bd21a4-836f-4c46-a04a-6a5f262004a7-kube-api-access-5kh2s\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.388520 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bd21a4-836f-4c46-a04a-6a5f262004a7-config-volume\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.388598 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7bd21a4-836f-4c46-a04a-6a5f262004a7-secret-volume\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.388626 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kh2s\" (UniqueName: \"kubernetes.io/projected/c7bd21a4-836f-4c46-a04a-6a5f262004a7-kube-api-access-5kh2s\") pod 
\"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.389976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bd21a4-836f-4c46-a04a-6a5f262004a7-config-volume\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.396611 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7bd21a4-836f-4c46-a04a-6a5f262004a7-secret-volume\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.404018 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kh2s\" (UniqueName: \"kubernetes.io/projected/c7bd21a4-836f-4c46-a04a-6a5f262004a7-kube-api-access-5kh2s\") pod \"collect-profiles-29339430-5225v\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.488699 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:00 crc kubenswrapper[4797]: I1013 14:30:00.911691 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v"] Oct 13 14:30:01 crc kubenswrapper[4797]: I1013 14:30:01.469784 4797 generic.go:334] "Generic (PLEG): container finished" podID="c7bd21a4-836f-4c46-a04a-6a5f262004a7" containerID="eafd5fb13c53708c4076fb9a4f3da118ab9fb0b8543c9acdfe748849a26b6a55" exitCode=0 Oct 13 14:30:01 crc kubenswrapper[4797]: I1013 14:30:01.469839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" event={"ID":"c7bd21a4-836f-4c46-a04a-6a5f262004a7","Type":"ContainerDied","Data":"eafd5fb13c53708c4076fb9a4f3da118ab9fb0b8543c9acdfe748849a26b6a55"} Oct 13 14:30:01 crc kubenswrapper[4797]: I1013 14:30:01.469881 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" event={"ID":"c7bd21a4-836f-4c46-a04a-6a5f262004a7","Type":"ContainerStarted","Data":"5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4"} Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.789222 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.938316 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bd21a4-836f-4c46-a04a-6a5f262004a7-config-volume\") pod \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.938369 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kh2s\" (UniqueName: \"kubernetes.io/projected/c7bd21a4-836f-4c46-a04a-6a5f262004a7-kube-api-access-5kh2s\") pod \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.938498 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7bd21a4-836f-4c46-a04a-6a5f262004a7-secret-volume\") pod \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\" (UID: \"c7bd21a4-836f-4c46-a04a-6a5f262004a7\") " Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.939069 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bd21a4-836f-4c46-a04a-6a5f262004a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7bd21a4-836f-4c46-a04a-6a5f262004a7" (UID: "c7bd21a4-836f-4c46-a04a-6a5f262004a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.944106 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7bd21a4-836f-4c46-a04a-6a5f262004a7-kube-api-access-5kh2s" (OuterVolumeSpecName: "kube-api-access-5kh2s") pod "c7bd21a4-836f-4c46-a04a-6a5f262004a7" (UID: "c7bd21a4-836f-4c46-a04a-6a5f262004a7"). 
InnerVolumeSpecName "kube-api-access-5kh2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:02 crc kubenswrapper[4797]: I1013 14:30:02.944952 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bd21a4-836f-4c46-a04a-6a5f262004a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7bd21a4-836f-4c46-a04a-6a5f262004a7" (UID: "c7bd21a4-836f-4c46-a04a-6a5f262004a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.041927 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bd21a4-836f-4c46-a04a-6a5f262004a7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.042314 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kh2s\" (UniqueName: \"kubernetes.io/projected/c7bd21a4-836f-4c46-a04a-6a5f262004a7-kube-api-access-5kh2s\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.042446 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7bd21a4-836f-4c46-a04a-6a5f262004a7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.485637 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" event={"ID":"c7bd21a4-836f-4c46-a04a-6a5f262004a7","Type":"ContainerDied","Data":"5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4"} Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.486158 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4" Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.485685 4797 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v" Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.856950 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk"] Oct 13 14:30:03 crc kubenswrapper[4797]: I1013 14:30:03.863898 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339385-zjjkk"] Oct 13 14:30:05 crc kubenswrapper[4797]: I1013 14:30:05.245692 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d78659-91ac-4b36-aade-be7b446d8276" path="/var/lib/kubelet/pods/00d78659-91ac-4b36-aade-be7b446d8276/volumes" Oct 13 14:30:09 crc kubenswrapper[4797]: I1013 14:30:09.538120 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6mjj" event={"ID":"54342a44-f051-411b-8a7c-fabc11ebe1dc","Type":"ContainerStarted","Data":"08e76799c38714222e19274bef91718a1a471eb392fc01b5254816f1105ec6f8"} Oct 13 14:30:09 crc kubenswrapper[4797]: I1013 14:30:09.560630 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s6mjj" podStartSLOduration=2.496804943 podStartE2EDuration="18.560612052s" podCreationTimestamp="2025-10-13 14:29:51 +0000 UTC" firstStartedPulling="2025-10-13 14:29:52.626241426 +0000 UTC m=+4970.159791682" lastFinishedPulling="2025-10-13 14:30:08.690048505 +0000 UTC m=+4986.223598791" observedRunningTime="2025-10-13 14:30:09.560272964 +0000 UTC m=+4987.093823280" watchObservedRunningTime="2025-10-13 14:30:09.560612052 +0000 UTC m=+4987.094162318" Oct 13 14:30:10 crc kubenswrapper[4797]: E1013 14:30:10.875563 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice/crio-5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:30:12 crc kubenswrapper[4797]: I1013 14:30:12.569596 4797 generic.go:334] "Generic (PLEG): container finished" podID="54342a44-f051-411b-8a7c-fabc11ebe1dc" containerID="08e76799c38714222e19274bef91718a1a471eb392fc01b5254816f1105ec6f8" exitCode=0 Oct 13 14:30:12 crc kubenswrapper[4797]: I1013 14:30:12.569706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6mjj" event={"ID":"54342a44-f051-411b-8a7c-fabc11ebe1dc","Type":"ContainerDied","Data":"08e76799c38714222e19274bef91718a1a471eb392fc01b5254816f1105ec6f8"} Oct 13 14:30:13 crc kubenswrapper[4797]: I1013 14:30:13.951594 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s6mjj" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.054328 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dptn\" (UniqueName: \"kubernetes.io/projected/54342a44-f051-411b-8a7c-fabc11ebe1dc-kube-api-access-5dptn\") pod \"54342a44-f051-411b-8a7c-fabc11ebe1dc\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.054424 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-combined-ca-bundle\") pod \"54342a44-f051-411b-8a7c-fabc11ebe1dc\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.054473 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-db-sync-config-data\") pod \"54342a44-f051-411b-8a7c-fabc11ebe1dc\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.054560 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-config-data\") pod \"54342a44-f051-411b-8a7c-fabc11ebe1dc\" (UID: \"54342a44-f051-411b-8a7c-fabc11ebe1dc\") " Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.058968 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "54342a44-f051-411b-8a7c-fabc11ebe1dc" (UID: "54342a44-f051-411b-8a7c-fabc11ebe1dc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.059577 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54342a44-f051-411b-8a7c-fabc11ebe1dc-kube-api-access-5dptn" (OuterVolumeSpecName: "kube-api-access-5dptn") pod "54342a44-f051-411b-8a7c-fabc11ebe1dc" (UID: "54342a44-f051-411b-8a7c-fabc11ebe1dc"). InnerVolumeSpecName "kube-api-access-5dptn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.077989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54342a44-f051-411b-8a7c-fabc11ebe1dc" (UID: "54342a44-f051-411b-8a7c-fabc11ebe1dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.111657 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-config-data" (OuterVolumeSpecName: "config-data") pod "54342a44-f051-411b-8a7c-fabc11ebe1dc" (UID: "54342a44-f051-411b-8a7c-fabc11ebe1dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.156719 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.156758 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.156771 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dptn\" (UniqueName: \"kubernetes.io/projected/54342a44-f051-411b-8a7c-fabc11ebe1dc-kube-api-access-5dptn\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.156784 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54342a44-f051-411b-8a7c-fabc11ebe1dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.418886 4797 scope.go:117] "RemoveContainer" containerID="d61444aaba65a86d5ad081c58fbb5c5f16a174cc19db93ea309419d98c9f6f88" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.440725 4797 scope.go:117] "RemoveContainer" containerID="1316595c470754d486e8891348ee8a2d3a8ab9391726d1ef6892c1e152c2834a" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.615635 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6mjj" event={"ID":"54342a44-f051-411b-8a7c-fabc11ebe1dc","Type":"ContainerDied","Data":"ab0381abc5339ce7442416457d0e83f03b049ee713f833020939912d2531a63e"} Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.615705 4797 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ab0381abc5339ce7442416457d0e83f03b049ee713f833020939912d2531a63e" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.615736 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s6mjj" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.888272 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:14 crc kubenswrapper[4797]: E1013 14:30:14.889148 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54342a44-f051-411b-8a7c-fabc11ebe1dc" containerName="glance-db-sync" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.889167 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="54342a44-f051-411b-8a7c-fabc11ebe1dc" containerName="glance-db-sync" Oct 13 14:30:14 crc kubenswrapper[4797]: E1013 14:30:14.889197 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bd21a4-836f-4c46-a04a-6a5f262004a7" containerName="collect-profiles" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.889205 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bd21a4-836f-4c46-a04a-6a5f262004a7" containerName="collect-profiles" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.889408 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="54342a44-f051-411b-8a7c-fabc11ebe1dc" containerName="glance-db-sync" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.889437 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7bd21a4-836f-4c46-a04a-6a5f262004a7" containerName="collect-profiles" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.890991 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.894141 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.894379 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.894605 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.894871 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tgh5n" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.909968 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.970881 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-ceph\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.970965 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.971009 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.971291 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-logs\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.971412 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8tc\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-kube-api-access-wm8tc\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.971460 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:14 crc kubenswrapper[4797]: I1013 14:30:14.971485 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.021777 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69f96c4cc9-bc79b"] Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.023198 4797 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.030074 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69f96c4cc9-bc79b"] Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073621 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-ceph\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073678 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073714 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073761 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-dns-svc\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073790 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-sb\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073902 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7s7\" (UniqueName: \"kubernetes.io/projected/0f0d264f-0263-421f-b09e-20354fd33770-kube-api-access-hg7s7\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073950 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-config\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.073973 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-logs\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.074016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm8tc\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-kube-api-access-wm8tc\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.074048 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.074074 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.074101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-nb\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.075537 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-logs\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.075712 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.079587 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.079670 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.083442 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.093497 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.102804 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm8tc\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-kube-api-access-wm8tc\") pod \"glance-default-external-api-0\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.143251 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.144922 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.153722 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.164093 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.175018 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-dns-svc\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.175060 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-sb\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.175090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7s7\" (UniqueName: \"kubernetes.io/projected/0f0d264f-0263-421f-b09e-20354fd33770-kube-api-access-hg7s7\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.175127 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-config\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: 
I1013 14:30:15.175175 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-nb\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.176023 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-dns-svc\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.176137 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-sb\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.176358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-config\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.176364 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-nb\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.197352 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hg7s7\" (UniqueName: \"kubernetes.io/projected/0f0d264f-0263-421f-b09e-20354fd33770-kube-api-access-hg7s7\") pod \"dnsmasq-dns-69f96c4cc9-bc79b\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.259118 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.276607 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-logs\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.276678 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.276889 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.276994 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " 
pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.277044 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.277104 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.277163 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdrw\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-kube-api-access-2jdrw\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.353424 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379103 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-logs\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379180 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379218 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379262 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379297 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 
14:30:15.379351 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379383 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdrw\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-kube-api-access-2jdrw\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.379806 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-logs\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.380667 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.385839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.395233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.399050 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.399419 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdrw\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-kube-api-access-2jdrw\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.401767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.489465 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.663897 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69f96c4cc9-bc79b"] Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.793759 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:15 crc kubenswrapper[4797]: W1013 14:30:15.804404 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa6da6b7_2a11_48e5_8dc2_591746eb5c76.slice/crio-0b999eec57e669474d0bb044efacaa6986c8bf8dd84ff50baa519b4f07035f0f WatchSource:0}: Error finding container 0b999eec57e669474d0bb044efacaa6986c8bf8dd84ff50baa519b4f07035f0f: Status 404 returned error can't find the container with id 0b999eec57e669474d0bb044efacaa6986c8bf8dd84ff50baa519b4f07035f0f Oct 13 14:30:15 crc kubenswrapper[4797]: I1013 14:30:15.889966 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.317779 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.641472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa6da6b7-2a11-48e5-8dc2-591746eb5c76","Type":"ContainerStarted","Data":"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0"} Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.641513 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa6da6b7-2a11-48e5-8dc2-591746eb5c76","Type":"ContainerStarted","Data":"0b999eec57e669474d0bb044efacaa6986c8bf8dd84ff50baa519b4f07035f0f"} Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.646359 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f8a71b9-f566-42f3-b2d2-d04ccf17c248","Type":"ContainerStarted","Data":"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b"} Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.646389 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f8a71b9-f566-42f3-b2d2-d04ccf17c248","Type":"ContainerStarted","Data":"a9e28a299f6ad7c894b1f5fb98947534d257ff904ba3f3d6b2ddbea63eee7446"} Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.651911 4797 generic.go:334] "Generic (PLEG): container finished" podID="0f0d264f-0263-421f-b09e-20354fd33770" containerID="c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54" exitCode=0 Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.651961 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" event={"ID":"0f0d264f-0263-421f-b09e-20354fd33770","Type":"ContainerDied","Data":"c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54"} Oct 13 14:30:16 crc kubenswrapper[4797]: I1013 14:30:16.651988 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" event={"ID":"0f0d264f-0263-421f-b09e-20354fd33770","Type":"ContainerStarted","Data":"b148ab344d3364443e65ec6f834d5ea30537e9456b6077feef6bcc28ded9cae2"} Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.665171 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" event={"ID":"0f0d264f-0263-421f-b09e-20354fd33770","Type":"ContainerStarted","Data":"a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e"} Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.665480 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.667803 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa6da6b7-2a11-48e5-8dc2-591746eb5c76","Type":"ContainerStarted","Data":"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7"} Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.667868 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-log" containerID="cri-o://e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0" gracePeriod=30 Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.667951 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-httpd" containerID="cri-o://58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7" gracePeriod=30 Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.678727 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f8a71b9-f566-42f3-b2d2-d04ccf17c248","Type":"ContainerStarted","Data":"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6"} Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.693295 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" podStartSLOduration=3.6932660569999998 podStartE2EDuration="3.693266057s" podCreationTimestamp="2025-10-13 14:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:30:17.69172362 +0000 UTC m=+4995.225273886" watchObservedRunningTime="2025-10-13 14:30:17.693266057 +0000 UTC m=+4995.226816333" Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.740384 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=3.74034782 podStartE2EDuration="3.74034782s" podCreationTimestamp="2025-10-13 14:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:30:17.722061443 +0000 UTC m=+4995.255611749" watchObservedRunningTime="2025-10-13 14:30:17.74034782 +0000 UTC m=+4995.273898166" Oct 13 14:30:17 crc kubenswrapper[4797]: I1013 14:30:17.757875 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.757790718 podStartE2EDuration="2.757790718s" podCreationTimestamp="2025-10-13 14:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:30:17.752207301 +0000 UTC m=+4995.285757567" watchObservedRunningTime="2025-10-13 14:30:17.757790718 +0000 UTC m=+4995.291340974" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.119715 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.119781 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.119852 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:30:18 crc kubenswrapper[4797]: 
I1013 14:30:18.120630 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.120702 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" gracePeriod=600 Oct 13 14:30:18 crc kubenswrapper[4797]: E1013 14:30:18.243635 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.275220 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.335894 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-httpd-run\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336365 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-config-data\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336526 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-ceph\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336586 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-combined-ca-bundle\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336631 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm8tc\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-kube-api-access-wm8tc\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336717 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-logs\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.336760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-scripts\") pod \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\" (UID: \"aa6da6b7-2a11-48e5-8dc2-591746eb5c76\") " Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.337446 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.341720 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-logs" (OuterVolumeSpecName: "logs") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.342171 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-ceph" (OuterVolumeSpecName: "ceph") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.343968 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-kube-api-access-wm8tc" (OuterVolumeSpecName: "kube-api-access-wm8tc") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "kube-api-access-wm8tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.344225 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-scripts" (OuterVolumeSpecName: "scripts") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.376377 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.392901 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-config-data" (OuterVolumeSpecName: "config-data") pod "aa6da6b7-2a11-48e5-8dc2-591746eb5c76" (UID: "aa6da6b7-2a11-48e5-8dc2-591746eb5c76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.439297 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.439336 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.439346 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.439355 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.439368 
4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.439380 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm8tc\" (UniqueName: \"kubernetes.io/projected/aa6da6b7-2a11-48e5-8dc2-591746eb5c76-kube-api-access-wm8tc\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.508830 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.692285 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" exitCode=0 Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.692345 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b"} Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.692388 4797 scope.go:117] "RemoveContainer" containerID="bd2a230444894bdbf4c868b39927df09a997acfb15fbcd184a28d0e618ab816a" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.693174 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:30:18 crc kubenswrapper[4797]: E1013 14:30:18.693485 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.695299 4797 generic.go:334] "Generic (PLEG): container finished" podID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerID="58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7" exitCode=0 Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.695325 4797 generic.go:334] "Generic (PLEG): container finished" podID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerID="e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0" exitCode=143 Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.695712 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.695844 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa6da6b7-2a11-48e5-8dc2-591746eb5c76","Type":"ContainerDied","Data":"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7"} Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.708543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa6da6b7-2a11-48e5-8dc2-591746eb5c76","Type":"ContainerDied","Data":"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0"} Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.708621 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa6da6b7-2a11-48e5-8dc2-591746eb5c76","Type":"ContainerDied","Data":"0b999eec57e669474d0bb044efacaa6986c8bf8dd84ff50baa519b4f07035f0f"} Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.728762 4797 scope.go:117] "RemoveContainer" containerID="58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.769996 
4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.788500 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.795712 4797 scope.go:117] "RemoveContainer" containerID="e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.806718 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:18 crc kubenswrapper[4797]: E1013 14:30:18.807473 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-httpd" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.807493 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-httpd" Oct 13 14:30:18 crc kubenswrapper[4797]: E1013 14:30:18.807524 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-log" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.807535 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-log" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.807748 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-httpd" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.807771 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" containerName="glance-log" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.812024 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.818265 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.834059 4797 scope.go:117] "RemoveContainer" containerID="58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7" Oct 13 14:30:18 crc kubenswrapper[4797]: E1013 14:30:18.837909 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7\": container with ID starting with 58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7 not found: ID does not exist" containerID="58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.837945 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7"} err="failed to get container status \"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7\": rpc error: code = NotFound desc = could not find container \"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7\": container with ID starting with 58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7 not found: ID does not exist" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.837966 4797 scope.go:117] "RemoveContainer" containerID="e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0" Oct 13 14:30:18 crc kubenswrapper[4797]: E1013 14:30:18.842814 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0\": container with ID starting with 
e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0 not found: ID does not exist" containerID="e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.842899 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0"} err="failed to get container status \"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0\": rpc error: code = NotFound desc = could not find container \"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0\": container with ID starting with e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0 not found: ID does not exist" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.842942 4797 scope.go:117] "RemoveContainer" containerID="58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.844389 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.867286 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7"} err="failed to get container status \"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7\": rpc error: code = NotFound desc = could not find container \"58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7\": container with ID starting with 58ea0633e22453bebba9f7269210ec58fec1e463aa661b8a054ced37988f35a7 not found: ID does not exist" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.867342 4797 scope.go:117] "RemoveContainer" containerID="e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.871006 4797 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0"} err="failed to get container status \"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0\": rpc error: code = NotFound desc = could not find container \"e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0\": container with ID starting with e60f94a6675400af93867dfd309d46dfc7e8eb22f2cfaa4ec8644a15eb0603c0 not found: ID does not exist" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947014 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-config-data\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-ceph\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947145 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-scripts\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947210 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjx6\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-kube-api-access-ctjx6\") pod \"glance-default-external-api-0\" (UID: 
\"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947245 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947269 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:18 crc kubenswrapper[4797]: I1013 14:30:18.947298 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-logs\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.048515 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-scripts\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.048649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjx6\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-kube-api-access-ctjx6\") pod \"glance-default-external-api-0\" (UID: 
\"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.048687 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.048706 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.048742 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-logs\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.049015 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-config-data\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.049150 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-ceph\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 
14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.049245 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-logs\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.049861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.053648 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-ceph\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.054963 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-scripts\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.055418 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.059928 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-config-data\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.072622 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjx6\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-kube-api-access-ctjx6\") pod \"glance-default-external-api-0\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.181709 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.247923 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6da6b7-2a11-48e5-8dc2-591746eb5c76" path="/var/lib/kubelet/pods/aa6da6b7-2a11-48e5-8dc2-591746eb5c76/volumes" Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.707709 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-log" containerID="cri-o://25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b" gracePeriod=30 Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.707719 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-httpd" containerID="cri-o://57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6" gracePeriod=30 Oct 13 14:30:19 crc kubenswrapper[4797]: I1013 14:30:19.740277 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 
14:30:20.378957 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473377 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdrw\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-kube-api-access-2jdrw\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473428 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-logs\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473461 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-config-data\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-combined-ca-bundle\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473515 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-scripts\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473541 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-ceph\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.473597 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-httpd-run\") pod \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\" (UID: \"1f8a71b9-f566-42f3-b2d2-d04ccf17c248\") " Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.474155 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-logs" (OuterVolumeSpecName: "logs") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.474299 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.479200 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-scripts" (OuterVolumeSpecName: "scripts") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.479333 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-kube-api-access-2jdrw" (OuterVolumeSpecName: "kube-api-access-2jdrw") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "kube-api-access-2jdrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.483084 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-ceph" (OuterVolumeSpecName: "ceph") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.508915 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.538665 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-config-data" (OuterVolumeSpecName: "config-data") pod "1f8a71b9-f566-42f3-b2d2-d04ccf17c248" (UID: "1f8a71b9-f566-42f3-b2d2-d04ccf17c248"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578328 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdrw\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-kube-api-access-2jdrw\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578378 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578392 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578405 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578416 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578426 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.578436 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f8a71b9-f566-42f3-b2d2-d04ccf17c248-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725013 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerID="57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6" exitCode=0 Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725052 4797 generic.go:334] "Generic (PLEG): container finished" podID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerID="25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b" exitCode=143 Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725103 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f8a71b9-f566-42f3-b2d2-d04ccf17c248","Type":"ContainerDied","Data":"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6"} Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725130 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f8a71b9-f566-42f3-b2d2-d04ccf17c248","Type":"ContainerDied","Data":"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b"} Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f8a71b9-f566-42f3-b2d2-d04ccf17c248","Type":"ContainerDied","Data":"a9e28a299f6ad7c894b1f5fb98947534d257ff904ba3f3d6b2ddbea63eee7446"} Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725158 4797 scope.go:117] "RemoveContainer" containerID="57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.725271 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.740859 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c07aef-3157-4265-b940-0d838eb32e9f","Type":"ContainerStarted","Data":"65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95"} Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.740896 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c07aef-3157-4265-b940-0d838eb32e9f","Type":"ContainerStarted","Data":"2ac51a03f59ebee4449a2e8cf71cf7257dd79df1fe56dd56fa6a1924b14e82a2"} Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.761472 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.774712 4797 scope.go:117] "RemoveContainer" containerID="25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.777659 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.783334 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:20 crc kubenswrapper[4797]: E1013 14:30:20.783764 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-httpd" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.783785 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-httpd" Oct 13 14:30:20 crc kubenswrapper[4797]: E1013 14:30:20.783805 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-log" Oct 13 14:30:20 crc 
kubenswrapper[4797]: I1013 14:30:20.783827 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-log" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.784002 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-httpd" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.784032 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" containerName="glance-log" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.787975 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.790107 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.797941 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.807154 4797 scope.go:117] "RemoveContainer" containerID="57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6" Oct 13 14:30:20 crc kubenswrapper[4797]: E1013 14:30:20.813373 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6\": container with ID starting with 57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6 not found: ID does not exist" containerID="57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.813652 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6"} err="failed to get container 
status \"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6\": rpc error: code = NotFound desc = could not find container \"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6\": container with ID starting with 57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6 not found: ID does not exist" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.813796 4797 scope.go:117] "RemoveContainer" containerID="25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b" Oct 13 14:30:20 crc kubenswrapper[4797]: E1013 14:30:20.814340 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b\": container with ID starting with 25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b not found: ID does not exist" containerID="25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.814368 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b"} err="failed to get container status \"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b\": rpc error: code = NotFound desc = could not find container \"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b\": container with ID starting with 25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b not found: ID does not exist" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.814387 4797 scope.go:117] "RemoveContainer" containerID="57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.815280 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6"} err="failed to get 
container status \"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6\": rpc error: code = NotFound desc = could not find container \"57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6\": container with ID starting with 57bc9bf0c42886ee3c9da87da422daa69f857d03dc0c49ac535d6ef8b3d385a6 not found: ID does not exist" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.815310 4797 scope.go:117] "RemoveContainer" containerID="25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.815540 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b"} err="failed to get container status \"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b\": rpc error: code = NotFound desc = could not find container \"25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b\": container with ID starting with 25df6f0784b96dcf7cbc720cb1acc7aa56aebff96056fe93689c101af2a7261b not found: ID does not exist" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887105 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887159 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887183 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887286 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887644 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27c45\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-kube-api-access-27c45\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887749 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.887986 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-logs\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 
14:30:20.989157 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-logs\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989246 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989290 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989326 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989357 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989443 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-27c45\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-kube-api-access-27c45\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989520 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989602 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-logs\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.989654 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.998951 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.999120 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:20 crc kubenswrapper[4797]: I1013 14:30:20.999439 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.000495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.012389 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27c45\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-kube-api-access-27c45\") pod \"glance-default-internal-api-0\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.114040 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:21 crc kubenswrapper[4797]: E1013 14:30:21.120935 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice/crio-5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4\": RecentStats: unable to find data in memory cache]" Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.292882 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8a71b9-f566-42f3-b2d2-d04ccf17c248" path="/var/lib/kubelet/pods/1f8a71b9-f566-42f3-b2d2-d04ccf17c248/volumes" Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.748135 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.752226 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c07aef-3157-4265-b940-0d838eb32e9f","Type":"ContainerStarted","Data":"1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a"} Oct 13 14:30:21 crc kubenswrapper[4797]: W1013 14:30:21.757350 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4129fe47_83ce_4c43_9549_39be0607dc11.slice/crio-783896fddb198770f7f3ec5a58b0c1b71ef87d5b06737f06e385c667477436eb WatchSource:0}: Error finding container 783896fddb198770f7f3ec5a58b0c1b71ef87d5b06737f06e385c667477436eb: Status 404 returned error can't find the container with id 783896fddb198770f7f3ec5a58b0c1b71ef87d5b06737f06e385c667477436eb Oct 13 14:30:21 crc kubenswrapper[4797]: I1013 14:30:21.783836 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.783797263 podStartE2EDuration="3.783797263s" podCreationTimestamp="2025-10-13 14:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:30:21.771040131 +0000 UTC m=+4999.304590387" watchObservedRunningTime="2025-10-13 14:30:21.783797263 +0000 UTC m=+4999.317347519" Oct 13 14:30:22 crc kubenswrapper[4797]: I1013 14:30:22.767893 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4129fe47-83ce-4c43-9549-39be0607dc11","Type":"ContainerStarted","Data":"ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213"} Oct 13 14:30:22 crc kubenswrapper[4797]: I1013 14:30:22.768155 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4129fe47-83ce-4c43-9549-39be0607dc11","Type":"ContainerStarted","Data":"783896fddb198770f7f3ec5a58b0c1b71ef87d5b06737f06e385c667477436eb"} Oct 13 14:30:23 crc kubenswrapper[4797]: I1013 14:30:23.778929 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4129fe47-83ce-4c43-9549-39be0607dc11","Type":"ContainerStarted","Data":"df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4"} Oct 13 14:30:23 crc kubenswrapper[4797]: I1013 14:30:23.802611 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.802588228 podStartE2EDuration="3.802588228s" podCreationTimestamp="2025-10-13 14:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:30:23.795069094 +0000 UTC m=+5001.328619390" watchObservedRunningTime="2025-10-13 14:30:23.802588228 +0000 
UTC m=+5001.336138484" Oct 13 14:30:25 crc kubenswrapper[4797]: I1013 14:30:25.355189 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:30:25 crc kubenswrapper[4797]: I1013 14:30:25.433868 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d669c78f-wd2k9"] Oct 13 14:30:25 crc kubenswrapper[4797]: I1013 14:30:25.434280 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerName="dnsmasq-dns" containerID="cri-o://74f001ad8e6d132da9c293330fc1ae0c8f91a5f9ce7bd679d3cf08b907e1c7e8" gracePeriod=10 Oct 13 14:30:25 crc kubenswrapper[4797]: I1013 14:30:25.802448 4797 generic.go:334] "Generic (PLEG): container finished" podID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerID="74f001ad8e6d132da9c293330fc1ae0c8f91a5f9ce7bd679d3cf08b907e1c7e8" exitCode=0 Oct 13 14:30:25 crc kubenswrapper[4797]: I1013 14:30:25.802492 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" event={"ID":"1ee0989b-49de-46c5-a81f-4d855c2a9b47","Type":"ContainerDied","Data":"74f001ad8e6d132da9c293330fc1ae0c8f91a5f9ce7bd679d3cf08b907e1c7e8"} Oct 13 14:30:25 crc kubenswrapper[4797]: I1013 14:30:25.954897 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.082532 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-config\") pod \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.082581 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-dns-svc\") pod \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.082609 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2sv8\" (UniqueName: \"kubernetes.io/projected/1ee0989b-49de-46c5-a81f-4d855c2a9b47-kube-api-access-c2sv8\") pod \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.082666 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-sb\") pod \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.082771 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-nb\") pod \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\" (UID: \"1ee0989b-49de-46c5-a81f-4d855c2a9b47\") " Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.088692 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1ee0989b-49de-46c5-a81f-4d855c2a9b47-kube-api-access-c2sv8" (OuterVolumeSpecName: "kube-api-access-c2sv8") pod "1ee0989b-49de-46c5-a81f-4d855c2a9b47" (UID: "1ee0989b-49de-46c5-a81f-4d855c2a9b47"). InnerVolumeSpecName "kube-api-access-c2sv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.134149 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1ee0989b-49de-46c5-a81f-4d855c2a9b47" (UID: "1ee0989b-49de-46c5-a81f-4d855c2a9b47"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.139960 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ee0989b-49de-46c5-a81f-4d855c2a9b47" (UID: "1ee0989b-49de-46c5-a81f-4d855c2a9b47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.142421 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1ee0989b-49de-46c5-a81f-4d855c2a9b47" (UID: "1ee0989b-49de-46c5-a81f-4d855c2a9b47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.142494 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-config" (OuterVolumeSpecName: "config") pod "1ee0989b-49de-46c5-a81f-4d855c2a9b47" (UID: "1ee0989b-49de-46c5-a81f-4d855c2a9b47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.184588 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.184630 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.184644 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2sv8\" (UniqueName: \"kubernetes.io/projected/1ee0989b-49de-46c5-a81f-4d855c2a9b47-kube-api-access-c2sv8\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.184658 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.184669 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ee0989b-49de-46c5-a81f-4d855c2a9b47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.816978 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" event={"ID":"1ee0989b-49de-46c5-a81f-4d855c2a9b47","Type":"ContainerDied","Data":"ec3401a9dda805c59e3a5bdbe380cc3a74c943e0ca9056197ee4758d392e6780"} Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.817075 4797 scope.go:117] "RemoveContainer" containerID="74f001ad8e6d132da9c293330fc1ae0c8f91a5f9ce7bd679d3cf08b907e1c7e8" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.817123 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d669c78f-wd2k9" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.848301 4797 scope.go:117] "RemoveContainer" containerID="1d81bec1b7b7309619f89efc40364eb3572fdd36037e8b80efa117c69603d044" Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.871016 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d669c78f-wd2k9"] Oct 13 14:30:26 crc kubenswrapper[4797]: I1013 14:30:26.882281 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75d669c78f-wd2k9"] Oct 13 14:30:27 crc kubenswrapper[4797]: I1013 14:30:27.254230 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" path="/var/lib/kubelet/pods/1ee0989b-49de-46c5-a81f-4d855c2a9b47/volumes" Oct 13 14:30:29 crc kubenswrapper[4797]: I1013 14:30:29.182738 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 14:30:29 crc kubenswrapper[4797]: I1013 14:30:29.183069 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 14:30:29 crc kubenswrapper[4797]: I1013 14:30:29.252678 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 14:30:29 crc kubenswrapper[4797]: I1013 14:30:29.253095 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 14:30:29 crc kubenswrapper[4797]: I1013 14:30:29.844515 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 14:30:29 crc kubenswrapper[4797]: I1013 14:30:29.844554 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.114415 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.116629 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.155525 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.164612 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: E1013 14:30:31.378796 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice/crio-5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.867704 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.868190 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.887314 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.887459 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 14:30:31 crc kubenswrapper[4797]: I1013 14:30:31.899050 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 14:30:33 crc kubenswrapper[4797]: I1013 14:30:33.245140 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:30:33 crc kubenswrapper[4797]: E1013 14:30:33.245693 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:30:33 crc kubenswrapper[4797]: I1013 14:30:33.785763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:33 crc kubenswrapper[4797]: I1013 14:30:33.884762 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 14:30:33 crc kubenswrapper[4797]: I1013 14:30:33.886043 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 14:30:41 crc kubenswrapper[4797]: E1013 14:30:41.645699 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice/crio-5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.745247 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-rq7jf"] Oct 13 14:30:41 crc kubenswrapper[4797]: E1013 14:30:41.746235 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerName="init" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.746265 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerName="init" Oct 13 14:30:41 crc kubenswrapper[4797]: E1013 14:30:41.746303 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerName="dnsmasq-dns" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.746314 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerName="dnsmasq-dns" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.746874 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee0989b-49de-46c5-a81f-4d855c2a9b47" containerName="dnsmasq-dns" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.748104 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.798050 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rq7jf"] Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.883974 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6vw\" (UniqueName: \"kubernetes.io/projected/dad69435-1f0a-4d7d-b3c6-b5c6eb491740-kube-api-access-th6vw\") pod \"placement-db-create-rq7jf\" (UID: \"dad69435-1f0a-4d7d-b3c6-b5c6eb491740\") " pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:41 crc kubenswrapper[4797]: I1013 14:30:41.985439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th6vw\" (UniqueName: \"kubernetes.io/projected/dad69435-1f0a-4d7d-b3c6-b5c6eb491740-kube-api-access-th6vw\") pod \"placement-db-create-rq7jf\" (UID: \"dad69435-1f0a-4d7d-b3c6-b5c6eb491740\") " pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:42 crc kubenswrapper[4797]: I1013 14:30:42.010138 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6vw\" (UniqueName: \"kubernetes.io/projected/dad69435-1f0a-4d7d-b3c6-b5c6eb491740-kube-api-access-th6vw\") pod \"placement-db-create-rq7jf\" (UID: \"dad69435-1f0a-4d7d-b3c6-b5c6eb491740\") " pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:42 crc kubenswrapper[4797]: I1013 14:30:42.094106 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:42 crc kubenswrapper[4797]: I1013 14:30:42.533056 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rq7jf"] Oct 13 14:30:42 crc kubenswrapper[4797]: W1013 14:30:42.533890 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddad69435_1f0a_4d7d_b3c6_b5c6eb491740.slice/crio-aae207e7432c4e88bf69d2cb8ec4dc59edeafc90d614d2c85797be40807b2061 WatchSource:0}: Error finding container aae207e7432c4e88bf69d2cb8ec4dc59edeafc90d614d2c85797be40807b2061: Status 404 returned error can't find the container with id aae207e7432c4e88bf69d2cb8ec4dc59edeafc90d614d2c85797be40807b2061 Oct 13 14:30:42 crc kubenswrapper[4797]: I1013 14:30:42.979420 4797 generic.go:334] "Generic (PLEG): container finished" podID="dad69435-1f0a-4d7d-b3c6-b5c6eb491740" containerID="a5a9ce3a90682dacfe8f75663327b413798499a48b0a9f187a030c8c9f5d76b6" exitCode=0 Oct 13 14:30:42 crc kubenswrapper[4797]: I1013 14:30:42.979514 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rq7jf" event={"ID":"dad69435-1f0a-4d7d-b3c6-b5c6eb491740","Type":"ContainerDied","Data":"a5a9ce3a90682dacfe8f75663327b413798499a48b0a9f187a030c8c9f5d76b6"} Oct 13 14:30:42 crc kubenswrapper[4797]: I1013 14:30:42.979961 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rq7jf" event={"ID":"dad69435-1f0a-4d7d-b3c6-b5c6eb491740","Type":"ContainerStarted","Data":"aae207e7432c4e88bf69d2cb8ec4dc59edeafc90d614d2c85797be40807b2061"} Oct 13 14:30:44 crc kubenswrapper[4797]: I1013 14:30:44.237209 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:30:44 crc kubenswrapper[4797]: E1013 14:30:44.237949 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:30:44 crc kubenswrapper[4797]: I1013 14:30:44.338911 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:44 crc kubenswrapper[4797]: I1013 14:30:44.429042 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th6vw\" (UniqueName: \"kubernetes.io/projected/dad69435-1f0a-4d7d-b3c6-b5c6eb491740-kube-api-access-th6vw\") pod \"dad69435-1f0a-4d7d-b3c6-b5c6eb491740\" (UID: \"dad69435-1f0a-4d7d-b3c6-b5c6eb491740\") " Oct 13 14:30:44 crc kubenswrapper[4797]: I1013 14:30:44.437067 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad69435-1f0a-4d7d-b3c6-b5c6eb491740-kube-api-access-th6vw" (OuterVolumeSpecName: "kube-api-access-th6vw") pod "dad69435-1f0a-4d7d-b3c6-b5c6eb491740" (UID: "dad69435-1f0a-4d7d-b3c6-b5c6eb491740"). InnerVolumeSpecName "kube-api-access-th6vw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:44 crc kubenswrapper[4797]: I1013 14:30:44.531079 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th6vw\" (UniqueName: \"kubernetes.io/projected/dad69435-1f0a-4d7d-b3c6-b5c6eb491740-kube-api-access-th6vw\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:45 crc kubenswrapper[4797]: I1013 14:30:45.004039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rq7jf" event={"ID":"dad69435-1f0a-4d7d-b3c6-b5c6eb491740","Type":"ContainerDied","Data":"aae207e7432c4e88bf69d2cb8ec4dc59edeafc90d614d2c85797be40807b2061"} Oct 13 14:30:45 crc kubenswrapper[4797]: I1013 14:30:45.004087 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aae207e7432c4e88bf69d2cb8ec4dc59edeafc90d614d2c85797be40807b2061" Oct 13 14:30:45 crc kubenswrapper[4797]: I1013 14:30:45.004114 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rq7jf" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.867614 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7d13-account-create-lddbw"] Oct 13 14:30:51 crc kubenswrapper[4797]: E1013 14:30:51.868571 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad69435-1f0a-4d7d-b3c6-b5c6eb491740" containerName="mariadb-database-create" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.868585 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad69435-1f0a-4d7d-b3c6-b5c6eb491740" containerName="mariadb-database-create" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.868772 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad69435-1f0a-4d7d-b3c6-b5c6eb491740" containerName="mariadb-database-create" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.869471 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.873103 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.886586 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d13-account-create-lddbw"] Oct 13 14:30:51 crc kubenswrapper[4797]: E1013 14:30:51.926946 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice/crio-5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4\": RecentStats: unable to find data in memory cache]" Oct 13 14:30:51 crc kubenswrapper[4797]: I1013 14:30:51.970971 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jpcb\" (UniqueName: \"kubernetes.io/projected/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183-kube-api-access-9jpcb\") pod \"placement-7d13-account-create-lddbw\" (UID: \"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183\") " pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:52 crc kubenswrapper[4797]: I1013 14:30:52.073411 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jpcb\" (UniqueName: \"kubernetes.io/projected/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183-kube-api-access-9jpcb\") pod \"placement-7d13-account-create-lddbw\" (UID: \"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183\") " pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:52 crc kubenswrapper[4797]: I1013 14:30:52.092434 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jpcb\" 
(UniqueName: \"kubernetes.io/projected/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183-kube-api-access-9jpcb\") pod \"placement-7d13-account-create-lddbw\" (UID: \"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183\") " pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:52 crc kubenswrapper[4797]: I1013 14:30:52.252539 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:52 crc kubenswrapper[4797]: I1013 14:30:52.723417 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d13-account-create-lddbw"] Oct 13 14:30:53 crc kubenswrapper[4797]: I1013 14:30:53.077358 4797 generic.go:334] "Generic (PLEG): container finished" podID="3e6ff53c-6611-4d2a-a7fb-42d8f5d75183" containerID="b378593e68d043e685faca81e7e6ac3933ada9f1711ba5b62aff96b0ab29e96c" exitCode=0 Oct 13 14:30:53 crc kubenswrapper[4797]: I1013 14:30:53.077586 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d13-account-create-lddbw" event={"ID":"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183","Type":"ContainerDied","Data":"b378593e68d043e685faca81e7e6ac3933ada9f1711ba5b62aff96b0ab29e96c"} Oct 13 14:30:53 crc kubenswrapper[4797]: I1013 14:30:53.077863 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d13-account-create-lddbw" event={"ID":"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183","Type":"ContainerStarted","Data":"8dc476dd539744105235345b24edd8800074fdff485d9548e42ac4f8ba5c1bf2"} Oct 13 14:30:54 crc kubenswrapper[4797]: I1013 14:30:54.472255 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:54 crc kubenswrapper[4797]: I1013 14:30:54.536409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jpcb\" (UniqueName: \"kubernetes.io/projected/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183-kube-api-access-9jpcb\") pod \"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183\" (UID: \"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183\") " Oct 13 14:30:54 crc kubenswrapper[4797]: I1013 14:30:54.541747 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183-kube-api-access-9jpcb" (OuterVolumeSpecName: "kube-api-access-9jpcb") pod "3e6ff53c-6611-4d2a-a7fb-42d8f5d75183" (UID: "3e6ff53c-6611-4d2a-a7fb-42d8f5d75183"). InnerVolumeSpecName "kube-api-access-9jpcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:30:54 crc kubenswrapper[4797]: I1013 14:30:54.637933 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jpcb\" (UniqueName: \"kubernetes.io/projected/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183-kube-api-access-9jpcb\") on node \"crc\" DevicePath \"\"" Oct 13 14:30:55 crc kubenswrapper[4797]: I1013 14:30:55.099395 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d13-account-create-lddbw" event={"ID":"3e6ff53c-6611-4d2a-a7fb-42d8f5d75183","Type":"ContainerDied","Data":"8dc476dd539744105235345b24edd8800074fdff485d9548e42ac4f8ba5c1bf2"} Oct 13 14:30:55 crc kubenswrapper[4797]: I1013 14:30:55.099437 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc476dd539744105235345b24edd8800074fdff485d9548e42ac4f8ba5c1bf2" Oct 13 14:30:55 crc kubenswrapper[4797]: I1013 14:30:55.099486 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7d13-account-create-lddbw" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.162525 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d9899cb4c-j4tsm"] Oct 13 14:30:57 crc kubenswrapper[4797]: E1013 14:30:57.163135 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6ff53c-6611-4d2a-a7fb-42d8f5d75183" containerName="mariadb-account-create" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.163146 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6ff53c-6611-4d2a-a7fb-42d8f5d75183" containerName="mariadb-account-create" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.163318 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6ff53c-6611-4d2a-a7fb-42d8f5d75183" containerName="mariadb-account-create" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.164279 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.179096 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9899cb4c-j4tsm"] Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.213028 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4cnz5"] Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.214384 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.217692 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-vww2t" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.217697 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.217762 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.225223 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4cnz5"] Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.286477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.286555 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-dns-svc\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.286583 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 
14:30:57.286606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-config\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.286633 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qns\" (UniqueName: \"kubernetes.io/projected/22a19aef-3dad-4ad9-a048-600878189bf6-kube-api-access-s2qns\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.387795 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-config-data\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.387847 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-combined-ca-bundle\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.387918 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-logs\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.387953 
4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6q2\" (UniqueName: \"kubernetes.io/projected/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-kube-api-access-6s6q2\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.387994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.388012 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-scripts\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.388080 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-dns-svc\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.388110 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.388176 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-config\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.388220 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qns\" (UniqueName: \"kubernetes.io/projected/22a19aef-3dad-4ad9-a048-600878189bf6-kube-api-access-s2qns\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.388910 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.389370 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.390059 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-config\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.390398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-dns-svc\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.414983 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qns\" (UniqueName: \"kubernetes.io/projected/22a19aef-3dad-4ad9-a048-600878189bf6-kube-api-access-s2qns\") pod \"dnsmasq-dns-7d9899cb4c-j4tsm\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.489702 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-config-data\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.489769 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-combined-ca-bundle\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.489866 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-logs\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.489941 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6q2\" (UniqueName: \"kubernetes.io/projected/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-kube-api-access-6s6q2\") pod 
\"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.490002 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-scripts\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.490639 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-logs\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.491081 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.494116 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-combined-ca-bundle\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.503438 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-config-data\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.505889 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6q2\" (UniqueName: 
\"kubernetes.io/projected/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-kube-api-access-6s6q2\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.509796 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-scripts\") pod \"placement-db-sync-4cnz5\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:57 crc kubenswrapper[4797]: I1013 14:30:57.542565 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4cnz5" Oct 13 14:30:58 crc kubenswrapper[4797]: I1013 14:30:58.020400 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9899cb4c-j4tsm"] Oct 13 14:30:58 crc kubenswrapper[4797]: I1013 14:30:58.084540 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4cnz5"] Oct 13 14:30:58 crc kubenswrapper[4797]: W1013 14:30:58.098585 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b7675b_71aa_4ace_85f1_7fbb6c4862ab.slice/crio-38a5dc64bb8e9a78f40ebb971db5f8f0b7f099fe930ee904af0a4ed56df3f2bd WatchSource:0}: Error finding container 38a5dc64bb8e9a78f40ebb971db5f8f0b7f099fe930ee904af0a4ed56df3f2bd: Status 404 returned error can't find the container with id 38a5dc64bb8e9a78f40ebb971db5f8f0b7f099fe930ee904af0a4ed56df3f2bd Oct 13 14:30:58 crc kubenswrapper[4797]: I1013 14:30:58.101693 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:30:58 crc kubenswrapper[4797]: I1013 14:30:58.134355 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" 
event={"ID":"22a19aef-3dad-4ad9-a048-600878189bf6","Type":"ContainerStarted","Data":"37b318e5dfe1fd8d9b8a12a4a6666f2cfc140952765a3c991b3012895d87c968"} Oct 13 14:30:58 crc kubenswrapper[4797]: I1013 14:30:58.135687 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4cnz5" event={"ID":"10b7675b-71aa-4ace-85f1-7fbb6c4862ab","Type":"ContainerStarted","Data":"38a5dc64bb8e9a78f40ebb971db5f8f0b7f099fe930ee904af0a4ed56df3f2bd"} Oct 13 14:30:59 crc kubenswrapper[4797]: I1013 14:30:59.146597 4797 generic.go:334] "Generic (PLEG): container finished" podID="22a19aef-3dad-4ad9-a048-600878189bf6" containerID="cdccce2c6e8d08cc7bbb02be5f79b40826c1f325e1b05cf2c1dec39d04c4c62b" exitCode=0 Oct 13 14:30:59 crc kubenswrapper[4797]: I1013 14:30:59.146669 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" event={"ID":"22a19aef-3dad-4ad9-a048-600878189bf6","Type":"ContainerDied","Data":"cdccce2c6e8d08cc7bbb02be5f79b40826c1f325e1b05cf2c1dec39d04c4c62b"} Oct 13 14:30:59 crc kubenswrapper[4797]: I1013 14:30:59.236842 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:30:59 crc kubenswrapper[4797]: E1013 14:30:59.237307 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:31:00 crc kubenswrapper[4797]: I1013 14:31:00.168720 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" 
event={"ID":"22a19aef-3dad-4ad9-a048-600878189bf6","Type":"ContainerStarted","Data":"d6337036cf4355641b75f773ea7af8f0230be877189c8329a9121afb5b8ae26b"} Oct 13 14:31:00 crc kubenswrapper[4797]: I1013 14:31:00.170058 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:31:00 crc kubenswrapper[4797]: I1013 14:31:00.191324 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" podStartSLOduration=3.191307837 podStartE2EDuration="3.191307837s" podCreationTimestamp="2025-10-13 14:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:31:00.191156633 +0000 UTC m=+5037.724706939" watchObservedRunningTime="2025-10-13 14:31:00.191307837 +0000 UTC m=+5037.724858093" Oct 13 14:31:02 crc kubenswrapper[4797]: E1013 14:31:02.150586 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice/crio-5af9a5e2d067694b9a3d9a6a4fab55f8dcd868d372f7ab4afbc139b5a21ea2f4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bd21a4_836f_4c46_a04a_6a5f262004a7.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:31:02 crc kubenswrapper[4797]: I1013 14:31:02.185385 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4cnz5" event={"ID":"10b7675b-71aa-4ace-85f1-7fbb6c4862ab","Type":"ContainerStarted","Data":"21561b5d69e406155ea00274e6a74937b50060af2d456bc1598ac6f0aec22045"} Oct 13 14:31:02 crc kubenswrapper[4797]: I1013 14:31:02.203723 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4cnz5" podStartSLOduration=1.88997263 
podStartE2EDuration="5.203704124s" podCreationTimestamp="2025-10-13 14:30:57 +0000 UTC" firstStartedPulling="2025-10-13 14:30:58.101290467 +0000 UTC m=+5035.634840723" lastFinishedPulling="2025-10-13 14:31:01.415021961 +0000 UTC m=+5038.948572217" observedRunningTime="2025-10-13 14:31:02.200564267 +0000 UTC m=+5039.734114533" watchObservedRunningTime="2025-10-13 14:31:02.203704124 +0000 UTC m=+5039.737254380" Oct 13 14:31:03 crc kubenswrapper[4797]: I1013 14:31:03.198901 4797 generic.go:334] "Generic (PLEG): container finished" podID="10b7675b-71aa-4ace-85f1-7fbb6c4862ab" containerID="21561b5d69e406155ea00274e6a74937b50060af2d456bc1598ac6f0aec22045" exitCode=0 Oct 13 14:31:03 crc kubenswrapper[4797]: I1013 14:31:03.198999 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4cnz5" event={"ID":"10b7675b-71aa-4ace-85f1-7fbb6c4862ab","Type":"ContainerDied","Data":"21561b5d69e406155ea00274e6a74937b50060af2d456bc1598ac6f0aec22045"} Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.603160 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4cnz5" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.724140 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-combined-ca-bundle\") pod \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.724941 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-scripts\") pod \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.725216 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s6q2\" (UniqueName: \"kubernetes.io/projected/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-kube-api-access-6s6q2\") pod \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.725423 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-logs\") pod \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.725646 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-config-data\") pod \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\" (UID: \"10b7675b-71aa-4ace-85f1-7fbb6c4862ab\") " Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.725823 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-logs" (OuterVolumeSpecName: "logs") pod "10b7675b-71aa-4ace-85f1-7fbb6c4862ab" (UID: "10b7675b-71aa-4ace-85f1-7fbb6c4862ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.726368 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.730336 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-scripts" (OuterVolumeSpecName: "scripts") pod "10b7675b-71aa-4ace-85f1-7fbb6c4862ab" (UID: "10b7675b-71aa-4ace-85f1-7fbb6c4862ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.732489 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-kube-api-access-6s6q2" (OuterVolumeSpecName: "kube-api-access-6s6q2") pod "10b7675b-71aa-4ace-85f1-7fbb6c4862ab" (UID: "10b7675b-71aa-4ace-85f1-7fbb6c4862ab"). InnerVolumeSpecName "kube-api-access-6s6q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.760193 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-config-data" (OuterVolumeSpecName: "config-data") pod "10b7675b-71aa-4ace-85f1-7fbb6c4862ab" (UID: "10b7675b-71aa-4ace-85f1-7fbb6c4862ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.770169 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10b7675b-71aa-4ace-85f1-7fbb6c4862ab" (UID: "10b7675b-71aa-4ace-85f1-7fbb6c4862ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.827715 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s6q2\" (UniqueName: \"kubernetes.io/projected/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-kube-api-access-6s6q2\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.827765 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.827781 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:04 crc kubenswrapper[4797]: I1013 14:31:04.827795 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10b7675b-71aa-4ace-85f1-7fbb6c4862ab-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.218069 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4cnz5" event={"ID":"10b7675b-71aa-4ace-85f1-7fbb6c4862ab","Type":"ContainerDied","Data":"38a5dc64bb8e9a78f40ebb971db5f8f0b7f099fe930ee904af0a4ed56df3f2bd"} Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.218123 4797 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="38a5dc64bb8e9a78f40ebb971db5f8f0b7f099fe930ee904af0a4ed56df3f2bd" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.218130 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4cnz5" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.283212 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f8d4c4db4-4chzm"] Oct 13 14:31:05 crc kubenswrapper[4797]: E1013 14:31:05.283665 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b7675b-71aa-4ace-85f1-7fbb6c4862ab" containerName="placement-db-sync" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.283686 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b7675b-71aa-4ace-85f1-7fbb6c4862ab" containerName="placement-db-sync" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.283911 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b7675b-71aa-4ace-85f1-7fbb6c4862ab" containerName="placement-db-sync" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.285107 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.287422 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.287554 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-vww2t" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.287703 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.310754 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f8d4c4db4-4chzm"] Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.438171 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-logs\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.438530 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-config-data\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.438797 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-combined-ca-bundle\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.439094 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzhtj\" (UniqueName: \"kubernetes.io/projected/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-kube-api-access-vzhtj\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.439298 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-scripts\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.541238 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-combined-ca-bundle\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.541348 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzhtj\" (UniqueName: \"kubernetes.io/projected/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-kube-api-access-vzhtj\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.541390 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-scripts\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.541470 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-logs\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.541508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-config-data\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.543125 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-logs\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.546581 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-combined-ca-bundle\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.547092 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-scripts\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.548013 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-config-data\") pod 
\"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.570226 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzhtj\" (UniqueName: \"kubernetes.io/projected/fb9dbb3f-deb5-48f7-815c-9d2166039ea9-kube-api-access-vzhtj\") pod \"placement-6f8d4c4db4-4chzm\" (UID: \"fb9dbb3f-deb5-48f7-815c-9d2166039ea9\") " pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:05 crc kubenswrapper[4797]: I1013 14:31:05.599504 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:06 crc kubenswrapper[4797]: I1013 14:31:06.025687 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f8d4c4db4-4chzm"] Oct 13 14:31:06 crc kubenswrapper[4797]: I1013 14:31:06.227428 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8d4c4db4-4chzm" event={"ID":"fb9dbb3f-deb5-48f7-815c-9d2166039ea9","Type":"ContainerStarted","Data":"6f32b0a311c83dad9b7bb83fa69ef2ce6f37bb7c44e1d496ced0af6749868ec5"} Oct 13 14:31:06 crc kubenswrapper[4797]: I1013 14:31:06.227796 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8d4c4db4-4chzm" event={"ID":"fb9dbb3f-deb5-48f7-815c-9d2166039ea9","Type":"ContainerStarted","Data":"0d632fd42ecdfe0079d6b5bfef1ba326055e5f16242f2001fbcaf65b657a41ef"} Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.248907 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.249241 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.249256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8d4c4db4-4chzm" 
event={"ID":"fb9dbb3f-deb5-48f7-815c-9d2166039ea9","Type":"ContainerStarted","Data":"160c1d3ab7f024b56dee12f35c156463984068111e943c27868f23daff26b6ba"} Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.259670 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f8d4c4db4-4chzm" podStartSLOduration=2.25965027 podStartE2EDuration="2.25965027s" podCreationTimestamp="2025-10-13 14:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:31:07.254126675 +0000 UTC m=+5044.787676941" watchObservedRunningTime="2025-10-13 14:31:07.25965027 +0000 UTC m=+5044.793200526" Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.493269 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.559483 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69f96c4cc9-bc79b"] Oct 13 14:31:07 crc kubenswrapper[4797]: I1013 14:31:07.559760 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" podUID="0f0d264f-0263-421f-b09e-20354fd33770" containerName="dnsmasq-dns" containerID="cri-o://a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e" gracePeriod=10 Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.005380 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.190370 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-config\") pod \"0f0d264f-0263-421f-b09e-20354fd33770\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.190492 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg7s7\" (UniqueName: \"kubernetes.io/projected/0f0d264f-0263-421f-b09e-20354fd33770-kube-api-access-hg7s7\") pod \"0f0d264f-0263-421f-b09e-20354fd33770\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.190537 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-nb\") pod \"0f0d264f-0263-421f-b09e-20354fd33770\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.190584 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-sb\") pod \"0f0d264f-0263-421f-b09e-20354fd33770\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.190692 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-dns-svc\") pod \"0f0d264f-0263-421f-b09e-20354fd33770\" (UID: \"0f0d264f-0263-421f-b09e-20354fd33770\") " Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.195550 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0f0d264f-0263-421f-b09e-20354fd33770-kube-api-access-hg7s7" (OuterVolumeSpecName: "kube-api-access-hg7s7") pod "0f0d264f-0263-421f-b09e-20354fd33770" (UID: "0f0d264f-0263-421f-b09e-20354fd33770"). InnerVolumeSpecName "kube-api-access-hg7s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.232377 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f0d264f-0263-421f-b09e-20354fd33770" (UID: "0f0d264f-0263-421f-b09e-20354fd33770"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.234535 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f0d264f-0263-421f-b09e-20354fd33770" (UID: "0f0d264f-0263-421f-b09e-20354fd33770"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.241928 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f0d264f-0263-421f-b09e-20354fd33770" (UID: "0f0d264f-0263-421f-b09e-20354fd33770"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.246957 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-config" (OuterVolumeSpecName: "config") pod "0f0d264f-0263-421f-b09e-20354fd33770" (UID: "0f0d264f-0263-421f-b09e-20354fd33770"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.252860 4797 generic.go:334] "Generic (PLEG): container finished" podID="0f0d264f-0263-421f-b09e-20354fd33770" containerID="a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e" exitCode=0 Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.252923 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" event={"ID":"0f0d264f-0263-421f-b09e-20354fd33770","Type":"ContainerDied","Data":"a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e"} Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.252974 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" event={"ID":"0f0d264f-0263-421f-b09e-20354fd33770","Type":"ContainerDied","Data":"b148ab344d3364443e65ec6f834d5ea30537e9456b6077feef6bcc28ded9cae2"} Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.252946 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f96c4cc9-bc79b" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.252991 4797 scope.go:117] "RemoveContainer" containerID="a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.292585 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg7s7\" (UniqueName: \"kubernetes.io/projected/0f0d264f-0263-421f-b09e-20354fd33770-kube-api-access-hg7s7\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.292630 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.292640 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.292654 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.292663 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0d264f-0263-421f-b09e-20354fd33770-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.332319 4797 scope.go:117] "RemoveContainer" containerID="c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.340741 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69f96c4cc9-bc79b"] Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.347081 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69f96c4cc9-bc79b"] Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.354973 4797 scope.go:117] "RemoveContainer" containerID="a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e" Oct 13 14:31:08 crc kubenswrapper[4797]: E1013 14:31:08.355535 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e\": container with ID starting with a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e not found: ID does not exist" containerID="a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.355571 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e"} err="failed to get container status \"a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e\": rpc error: code = NotFound desc = could not find container \"a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e\": container with ID starting with a6a04335a1935f5867e54e5e71bbaafcd89d3bf63483739ccbc10cf05102ba8e not found: ID does not exist" Oct 13 14:31:08 crc kubenswrapper[4797]: I1013 14:31:08.355597 4797 scope.go:117] "RemoveContainer" containerID="c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54" Oct 13 14:31:08 crc kubenswrapper[4797]: E1013 14:31:08.355898 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54\": container with ID starting with c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54 not found: ID does not exist" containerID="c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54" Oct 13 14:31:08 crc 
kubenswrapper[4797]: I1013 14:31:08.355933 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54"} err="failed to get container status \"c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54\": rpc error: code = NotFound desc = could not find container \"c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54\": container with ID starting with c4c5767ca5dcfbe961e431d80b91714df4d6e3afb37bbb0b325e6834d7acfa54 not found: ID does not exist" Oct 13 14:31:09 crc kubenswrapper[4797]: I1013 14:31:09.247515 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0d264f-0263-421f-b09e-20354fd33770" path="/var/lib/kubelet/pods/0f0d264f-0263-421f-b09e-20354fd33770/volumes" Oct 13 14:31:14 crc kubenswrapper[4797]: I1013 14:31:14.236315 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:31:14 crc kubenswrapper[4797]: E1013 14:31:14.237265 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:31:26 crc kubenswrapper[4797]: I1013 14:31:26.236638 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:31:26 crc kubenswrapper[4797]: E1013 14:31:26.237559 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:31:36 crc kubenswrapper[4797]: I1013 14:31:36.667047 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:36 crc kubenswrapper[4797]: I1013 14:31:36.677915 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f8d4c4db4-4chzm" Oct 13 14:31:40 crc kubenswrapper[4797]: I1013 14:31:40.237358 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:31:40 crc kubenswrapper[4797]: E1013 14:31:40.237944 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:31:55 crc kubenswrapper[4797]: I1013 14:31:55.236209 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:31:55 crc kubenswrapper[4797]: E1013 14:31:55.237004 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.894032 4797 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-sbxhd"] Oct 13 14:32:01 crc kubenswrapper[4797]: E1013 14:32:01.896205 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0d264f-0263-421f-b09e-20354fd33770" containerName="dnsmasq-dns" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.896321 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0d264f-0263-421f-b09e-20354fd33770" containerName="dnsmasq-dns" Oct 13 14:32:01 crc kubenswrapper[4797]: E1013 14:32:01.896433 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0d264f-0263-421f-b09e-20354fd33770" containerName="init" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.896513 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0d264f-0263-421f-b09e-20354fd33770" containerName="init" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.897136 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0d264f-0263-421f-b09e-20354fd33770" containerName="dnsmasq-dns" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.898063 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.913700 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sbxhd"] Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.963221 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh9g7\" (UniqueName: \"kubernetes.io/projected/7bb03bfd-b126-482a-be5f-abb05c56d932-kube-api-access-vh9g7\") pod \"nova-api-db-create-sbxhd\" (UID: \"7bb03bfd-b126-482a-be5f-abb05c56d932\") " pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.990281 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fwp9r"] Oct 13 14:32:01 crc kubenswrapper[4797]: I1013 14:32:01.992416 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.000501 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fwp9r"] Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.072838 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9sd\" (UniqueName: \"kubernetes.io/projected/e8c66927-df80-4626-a9a4-5af6596fac42-kube-api-access-5w9sd\") pod \"nova-cell0-db-create-fwp9r\" (UID: \"e8c66927-df80-4626-a9a4-5af6596fac42\") " pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.072963 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh9g7\" (UniqueName: \"kubernetes.io/projected/7bb03bfd-b126-482a-be5f-abb05c56d932-kube-api-access-vh9g7\") pod \"nova-api-db-create-sbxhd\" (UID: \"7bb03bfd-b126-482a-be5f-abb05c56d932\") " pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:02 crc kubenswrapper[4797]: 
I1013 14:32:02.087119 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-s9hws"] Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.088229 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.090553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh9g7\" (UniqueName: \"kubernetes.io/projected/7bb03bfd-b126-482a-be5f-abb05c56d932-kube-api-access-vh9g7\") pod \"nova-api-db-create-sbxhd\" (UID: \"7bb03bfd-b126-482a-be5f-abb05c56d932\") " pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.107406 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s9hws"] Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.174480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9sd\" (UniqueName: \"kubernetes.io/projected/e8c66927-df80-4626-a9a4-5af6596fac42-kube-api-access-5w9sd\") pod \"nova-cell0-db-create-fwp9r\" (UID: \"e8c66927-df80-4626-a9a4-5af6596fac42\") " pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.174917 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjwn6\" (UniqueName: \"kubernetes.io/projected/d9068568-7dc0-4636-8129-794e57dd32e3-kube-api-access-tjwn6\") pod \"nova-cell1-db-create-s9hws\" (UID: \"d9068568-7dc0-4636-8129-794e57dd32e3\") " pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.194144 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9sd\" (UniqueName: \"kubernetes.io/projected/e8c66927-df80-4626-a9a4-5af6596fac42-kube-api-access-5w9sd\") pod \"nova-cell0-db-create-fwp9r\" (UID: 
\"e8c66927-df80-4626-a9a4-5af6596fac42\") " pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.264743 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.276974 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjwn6\" (UniqueName: \"kubernetes.io/projected/d9068568-7dc0-4636-8129-794e57dd32e3-kube-api-access-tjwn6\") pod \"nova-cell1-db-create-s9hws\" (UID: \"d9068568-7dc0-4636-8129-794e57dd32e3\") " pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.298409 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjwn6\" (UniqueName: \"kubernetes.io/projected/d9068568-7dc0-4636-8129-794e57dd32e3-kube-api-access-tjwn6\") pod \"nova-cell1-db-create-s9hws\" (UID: \"d9068568-7dc0-4636-8129-794e57dd32e3\") " pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.311227 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.447339 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.747560 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s9hws"] Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.756668 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sbxhd"] Oct 13 14:32:02 crc kubenswrapper[4797]: W1013 14:32:02.764115 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9068568_7dc0_4636_8129_794e57dd32e3.slice/crio-2b2572d7b18981fdf57ff8759ff334ade90a851c1fcc1298f856c14724abb936 WatchSource:0}: Error finding container 2b2572d7b18981fdf57ff8759ff334ade90a851c1fcc1298f856c14724abb936: Status 404 returned error can't find the container with id 2b2572d7b18981fdf57ff8759ff334ade90a851c1fcc1298f856c14724abb936 Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.773958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sbxhd" event={"ID":"7bb03bfd-b126-482a-be5f-abb05c56d932","Type":"ContainerStarted","Data":"87700bfd0f412aca3d41815bdc55196856b6b85302af8f7a81ae0a61c9891827"} Oct 13 14:32:02 crc kubenswrapper[4797]: I1013 14:32:02.837120 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fwp9r"] Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.800254 4797 generic.go:334] "Generic (PLEG): container finished" podID="e8c66927-df80-4626-a9a4-5af6596fac42" containerID="f5468adde202ce08735f6bcd3dd3317db4fbf8363100efc31a0af21200d6ce95" exitCode=0 Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.802147 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fwp9r" event={"ID":"e8c66927-df80-4626-a9a4-5af6596fac42","Type":"ContainerDied","Data":"f5468adde202ce08735f6bcd3dd3317db4fbf8363100efc31a0af21200d6ce95"} Oct 13 14:32:03 crc 
kubenswrapper[4797]: I1013 14:32:03.802198 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fwp9r" event={"ID":"e8c66927-df80-4626-a9a4-5af6596fac42","Type":"ContainerStarted","Data":"82bddbe63772e9f8f1ce5093ec4ca0567161d1ab591e75058f8f354b8bdc5253"} Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.803305 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9068568-7dc0-4636-8129-794e57dd32e3" containerID="9a001682c8f110ed0261d9b3fc9b75b4734e268d044145379dca9699293626a0" exitCode=0 Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.803370 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9hws" event={"ID":"d9068568-7dc0-4636-8129-794e57dd32e3","Type":"ContainerDied","Data":"9a001682c8f110ed0261d9b3fc9b75b4734e268d044145379dca9699293626a0"} Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.803393 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9hws" event={"ID":"d9068568-7dc0-4636-8129-794e57dd32e3","Type":"ContainerStarted","Data":"2b2572d7b18981fdf57ff8759ff334ade90a851c1fcc1298f856c14724abb936"} Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.805157 4797 generic.go:334] "Generic (PLEG): container finished" podID="7bb03bfd-b126-482a-be5f-abb05c56d932" containerID="5434d1dc1c5f053e87774f21a075c3701e704bba8a186e986c848735d7eea498" exitCode=0 Oct 13 14:32:03 crc kubenswrapper[4797]: I1013 14:32:03.805182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sbxhd" event={"ID":"7bb03bfd-b126-482a-be5f-abb05c56d932","Type":"ContainerDied","Data":"5434d1dc1c5f053e87774f21a075c3701e704bba8a186e986c848735d7eea498"} Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.253903 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.260416 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.269743 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.335371 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh9g7\" (UniqueName: \"kubernetes.io/projected/7bb03bfd-b126-482a-be5f-abb05c56d932-kube-api-access-vh9g7\") pod \"7bb03bfd-b126-482a-be5f-abb05c56d932\" (UID: \"7bb03bfd-b126-482a-be5f-abb05c56d932\") " Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.335503 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w9sd\" (UniqueName: \"kubernetes.io/projected/e8c66927-df80-4626-a9a4-5af6596fac42-kube-api-access-5w9sd\") pod \"e8c66927-df80-4626-a9a4-5af6596fac42\" (UID: \"e8c66927-df80-4626-a9a4-5af6596fac42\") " Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.335589 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjwn6\" (UniqueName: \"kubernetes.io/projected/d9068568-7dc0-4636-8129-794e57dd32e3-kube-api-access-tjwn6\") pod \"d9068568-7dc0-4636-8129-794e57dd32e3\" (UID: \"d9068568-7dc0-4636-8129-794e57dd32e3\") " Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.343053 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c66927-df80-4626-a9a4-5af6596fac42-kube-api-access-5w9sd" (OuterVolumeSpecName: "kube-api-access-5w9sd") pod "e8c66927-df80-4626-a9a4-5af6596fac42" (UID: "e8c66927-df80-4626-a9a4-5af6596fac42"). InnerVolumeSpecName "kube-api-access-5w9sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.343357 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb03bfd-b126-482a-be5f-abb05c56d932-kube-api-access-vh9g7" (OuterVolumeSpecName: "kube-api-access-vh9g7") pod "7bb03bfd-b126-482a-be5f-abb05c56d932" (UID: "7bb03bfd-b126-482a-be5f-abb05c56d932"). InnerVolumeSpecName "kube-api-access-vh9g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.343563 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9068568-7dc0-4636-8129-794e57dd32e3-kube-api-access-tjwn6" (OuterVolumeSpecName: "kube-api-access-tjwn6") pod "d9068568-7dc0-4636-8129-794e57dd32e3" (UID: "d9068568-7dc0-4636-8129-794e57dd32e3"). InnerVolumeSpecName "kube-api-access-tjwn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.437790 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w9sd\" (UniqueName: \"kubernetes.io/projected/e8c66927-df80-4626-a9a4-5af6596fac42-kube-api-access-5w9sd\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.438832 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjwn6\" (UniqueName: \"kubernetes.io/projected/d9068568-7dc0-4636-8129-794e57dd32e3-kube-api-access-tjwn6\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.438849 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh9g7\" (UniqueName: \"kubernetes.io/projected/7bb03bfd-b126-482a-be5f-abb05c56d932-kube-api-access-vh9g7\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.837512 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9hws" 
event={"ID":"d9068568-7dc0-4636-8129-794e57dd32e3","Type":"ContainerDied","Data":"2b2572d7b18981fdf57ff8759ff334ade90a851c1fcc1298f856c14724abb936"} Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.837586 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b2572d7b18981fdf57ff8759ff334ade90a851c1fcc1298f856c14724abb936" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.837726 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9hws" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.841646 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sbxhd" event={"ID":"7bb03bfd-b126-482a-be5f-abb05c56d932","Type":"ContainerDied","Data":"87700bfd0f412aca3d41815bdc55196856b6b85302af8f7a81ae0a61c9891827"} Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.841733 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87700bfd0f412aca3d41815bdc55196856b6b85302af8f7a81ae0a61c9891827" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.841865 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sbxhd" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.844848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fwp9r" event={"ID":"e8c66927-df80-4626-a9a4-5af6596fac42","Type":"ContainerDied","Data":"82bddbe63772e9f8f1ce5093ec4ca0567161d1ab591e75058f8f354b8bdc5253"} Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.844898 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82bddbe63772e9f8f1ce5093ec4ca0567161d1ab591e75058f8f354b8bdc5253" Oct 13 14:32:05 crc kubenswrapper[4797]: I1013 14:32:05.845093 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fwp9r" Oct 13 14:32:10 crc kubenswrapper[4797]: I1013 14:32:10.236312 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:32:10 crc kubenswrapper[4797]: E1013 14:32:10.236794 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.143303 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ea2d-account-create-5459z"] Oct 13 14:32:12 crc kubenswrapper[4797]: E1013 14:32:12.143954 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb03bfd-b126-482a-be5f-abb05c56d932" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.143966 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb03bfd-b126-482a-be5f-abb05c56d932" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: E1013 14:32:12.143984 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9068568-7dc0-4636-8129-794e57dd32e3" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.143990 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9068568-7dc0-4636-8129-794e57dd32e3" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: E1013 14:32:12.144002 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c66927-df80-4626-a9a4-5af6596fac42" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 
14:32:12.144008 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c66927-df80-4626-a9a4-5af6596fac42" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.144159 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb03bfd-b126-482a-be5f-abb05c56d932" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.144175 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9068568-7dc0-4636-8129-794e57dd32e3" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.144196 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c66927-df80-4626-a9a4-5af6596fac42" containerName="mariadb-database-create" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.144783 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.147651 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.153326 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ea2d-account-create-5459z"] Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.154618 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhrg\" (UniqueName: \"kubernetes.io/projected/e4981c2c-7ef7-468d-808c-c1dba459c1c0-kube-api-access-vwhrg\") pod \"nova-api-ea2d-account-create-5459z\" (UID: \"e4981c2c-7ef7-468d-808c-c1dba459c1c0\") " pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.256611 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhrg\" (UniqueName: 
\"kubernetes.io/projected/e4981c2c-7ef7-468d-808c-c1dba459c1c0-kube-api-access-vwhrg\") pod \"nova-api-ea2d-account-create-5459z\" (UID: \"e4981c2c-7ef7-468d-808c-c1dba459c1c0\") " pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.275223 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhrg\" (UniqueName: \"kubernetes.io/projected/e4981c2c-7ef7-468d-808c-c1dba459c1c0-kube-api-access-vwhrg\") pod \"nova-api-ea2d-account-create-5459z\" (UID: \"e4981c2c-7ef7-468d-808c-c1dba459c1c0\") " pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.316758 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cc58-account-create-4c2kv"] Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.318006 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.319912 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.330082 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc58-account-create-4c2kv"] Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.357410 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgwwh\" (UniqueName: \"kubernetes.io/projected/0ce3d55d-49d8-4559-bb4a-aab76084a7d3-kube-api-access-cgwwh\") pod \"nova-cell0-cc58-account-create-4c2kv\" (UID: \"0ce3d55d-49d8-4559-bb4a-aab76084a7d3\") " pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.459087 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgwwh\" (UniqueName: 
\"kubernetes.io/projected/0ce3d55d-49d8-4559-bb4a-aab76084a7d3-kube-api-access-cgwwh\") pod \"nova-cell0-cc58-account-create-4c2kv\" (UID: \"0ce3d55d-49d8-4559-bb4a-aab76084a7d3\") " pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.482416 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgwwh\" (UniqueName: \"kubernetes.io/projected/0ce3d55d-49d8-4559-bb4a-aab76084a7d3-kube-api-access-cgwwh\") pod \"nova-cell0-cc58-account-create-4c2kv\" (UID: \"0ce3d55d-49d8-4559-bb4a-aab76084a7d3\") " pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.503986 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.523276 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-64d8-account-create-c74vt"] Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.524620 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.528124 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.532822 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-64d8-account-create-c74vt"] Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.560682 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj92g\" (UniqueName: \"kubernetes.io/projected/9d225d04-4356-48b7-acd1-beb811cb4320-kube-api-access-xj92g\") pod \"nova-cell1-64d8-account-create-c74vt\" (UID: \"9d225d04-4356-48b7-acd1-beb811cb4320\") " pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.648724 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.662631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj92g\" (UniqueName: \"kubernetes.io/projected/9d225d04-4356-48b7-acd1-beb811cb4320-kube-api-access-xj92g\") pod \"nova-cell1-64d8-account-create-c74vt\" (UID: \"9d225d04-4356-48b7-acd1-beb811cb4320\") " pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.684174 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj92g\" (UniqueName: \"kubernetes.io/projected/9d225d04-4356-48b7-acd1-beb811cb4320-kube-api-access-xj92g\") pod \"nova-cell1-64d8-account-create-c74vt\" (UID: \"9d225d04-4356-48b7-acd1-beb811cb4320\") " pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.890463 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-cc58-account-create-4c2kv"] Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.902981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc58-account-create-4c2kv" event={"ID":"0ce3d55d-49d8-4559-bb4a-aab76084a7d3","Type":"ContainerStarted","Data":"8f38136f0fe217932ccd2fbedc90c80d52c63906c3deb9bf51dffa6f3ef409ba"} Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.927463 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:12 crc kubenswrapper[4797]: I1013 14:32:12.943265 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ea2d-account-create-5459z"] Oct 13 14:32:12 crc kubenswrapper[4797]: W1013 14:32:12.958038 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4981c2c_7ef7_468d_808c_c1dba459c1c0.slice/crio-04f764858507c6d8a55ea6e26a2075d153705b95afdfeb2dbced00a9f4860b3b WatchSource:0}: Error finding container 04f764858507c6d8a55ea6e26a2075d153705b95afdfeb2dbced00a9f4860b3b: Status 404 returned error can't find the container with id 04f764858507c6d8a55ea6e26a2075d153705b95afdfeb2dbced00a9f4860b3b Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.370492 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-64d8-account-create-c74vt"] Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.913307 4797 generic.go:334] "Generic (PLEG): container finished" podID="e4981c2c-7ef7-468d-808c-c1dba459c1c0" containerID="af5dbd20310a5fd1b74d30ecd1df10ae2d3b3c2187d9d485aaa6577802d3df22" exitCode=0 Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.913403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ea2d-account-create-5459z" 
event={"ID":"e4981c2c-7ef7-468d-808c-c1dba459c1c0","Type":"ContainerDied","Data":"af5dbd20310a5fd1b74d30ecd1df10ae2d3b3c2187d9d485aaa6577802d3df22"} Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.913617 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ea2d-account-create-5459z" event={"ID":"e4981c2c-7ef7-468d-808c-c1dba459c1c0","Type":"ContainerStarted","Data":"04f764858507c6d8a55ea6e26a2075d153705b95afdfeb2dbced00a9f4860b3b"} Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.918350 4797 generic.go:334] "Generic (PLEG): container finished" podID="0ce3d55d-49d8-4559-bb4a-aab76084a7d3" containerID="4f2548644b61d387f76ad56cf7cd644c9e497ce36b28bfcc828631f67644df02" exitCode=0 Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.918441 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc58-account-create-4c2kv" event={"ID":"0ce3d55d-49d8-4559-bb4a-aab76084a7d3","Type":"ContainerDied","Data":"4f2548644b61d387f76ad56cf7cd644c9e497ce36b28bfcc828631f67644df02"} Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.921249 4797 generic.go:334] "Generic (PLEG): container finished" podID="9d225d04-4356-48b7-acd1-beb811cb4320" containerID="f75f53b05a896d1bfa0f2f4c2a81ea202396bf1a4892f8008657de104104896d" exitCode=0 Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.921304 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-64d8-account-create-c74vt" event={"ID":"9d225d04-4356-48b7-acd1-beb811cb4320","Type":"ContainerDied","Data":"f75f53b05a896d1bfa0f2f4c2a81ea202396bf1a4892f8008657de104104896d"} Oct 13 14:32:13 crc kubenswrapper[4797]: I1013 14:32:13.921325 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-64d8-account-create-c74vt" event={"ID":"9d225d04-4356-48b7-acd1-beb811cb4320","Type":"ContainerStarted","Data":"4426e25e15839b7a1fab76d99cf2769aace11dff51b211ef67b6c9771d509883"} Oct 13 14:32:14 crc kubenswrapper[4797]: I1013 
14:32:14.576531 4797 scope.go:117] "RemoveContainer" containerID="6735b15f8ed0d8c37dd10d1365169713bf17cdea1794049fbbd7855fff44fa08" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.455662 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.460743 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.465248 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.623186 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj92g\" (UniqueName: \"kubernetes.io/projected/9d225d04-4356-48b7-acd1-beb811cb4320-kube-api-access-xj92g\") pod \"9d225d04-4356-48b7-acd1-beb811cb4320\" (UID: \"9d225d04-4356-48b7-acd1-beb811cb4320\") " Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.624485 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgwwh\" (UniqueName: \"kubernetes.io/projected/0ce3d55d-49d8-4559-bb4a-aab76084a7d3-kube-api-access-cgwwh\") pod \"0ce3d55d-49d8-4559-bb4a-aab76084a7d3\" (UID: \"0ce3d55d-49d8-4559-bb4a-aab76084a7d3\") " Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.624923 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwhrg\" (UniqueName: \"kubernetes.io/projected/e4981c2c-7ef7-468d-808c-c1dba459c1c0-kube-api-access-vwhrg\") pod \"e4981c2c-7ef7-468d-808c-c1dba459c1c0\" (UID: \"e4981c2c-7ef7-468d-808c-c1dba459c1c0\") " Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.629120 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9d225d04-4356-48b7-acd1-beb811cb4320-kube-api-access-xj92g" (OuterVolumeSpecName: "kube-api-access-xj92g") pod "9d225d04-4356-48b7-acd1-beb811cb4320" (UID: "9d225d04-4356-48b7-acd1-beb811cb4320"). InnerVolumeSpecName "kube-api-access-xj92g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.629655 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce3d55d-49d8-4559-bb4a-aab76084a7d3-kube-api-access-cgwwh" (OuterVolumeSpecName: "kube-api-access-cgwwh") pod "0ce3d55d-49d8-4559-bb4a-aab76084a7d3" (UID: "0ce3d55d-49d8-4559-bb4a-aab76084a7d3"). InnerVolumeSpecName "kube-api-access-cgwwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.632427 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4981c2c-7ef7-468d-808c-c1dba459c1c0-kube-api-access-vwhrg" (OuterVolumeSpecName: "kube-api-access-vwhrg") pod "e4981c2c-7ef7-468d-808c-c1dba459c1c0" (UID: "e4981c2c-7ef7-468d-808c-c1dba459c1c0"). InnerVolumeSpecName "kube-api-access-vwhrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.727937 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj92g\" (UniqueName: \"kubernetes.io/projected/9d225d04-4356-48b7-acd1-beb811cb4320-kube-api-access-xj92g\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.728204 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgwwh\" (UniqueName: \"kubernetes.io/projected/0ce3d55d-49d8-4559-bb4a-aab76084a7d3-kube-api-access-cgwwh\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.728215 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwhrg\" (UniqueName: \"kubernetes.io/projected/e4981c2c-7ef7-468d-808c-c1dba459c1c0-kube-api-access-vwhrg\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.939331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ea2d-account-create-5459z" event={"ID":"e4981c2c-7ef7-468d-808c-c1dba459c1c0","Type":"ContainerDied","Data":"04f764858507c6d8a55ea6e26a2075d153705b95afdfeb2dbced00a9f4860b3b"} Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.939379 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04f764858507c6d8a55ea6e26a2075d153705b95afdfeb2dbced00a9f4860b3b" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.939409 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ea2d-account-create-5459z" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.940594 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc58-account-create-4c2kv" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.940604 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc58-account-create-4c2kv" event={"ID":"0ce3d55d-49d8-4559-bb4a-aab76084a7d3","Type":"ContainerDied","Data":"8f38136f0fe217932ccd2fbedc90c80d52c63906c3deb9bf51dffa6f3ef409ba"} Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.940626 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f38136f0fe217932ccd2fbedc90c80d52c63906c3deb9bf51dffa6f3ef409ba" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.942504 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-64d8-account-create-c74vt" event={"ID":"9d225d04-4356-48b7-acd1-beb811cb4320","Type":"ContainerDied","Data":"4426e25e15839b7a1fab76d99cf2769aace11dff51b211ef67b6c9771d509883"} Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.942533 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4426e25e15839b7a1fab76d99cf2769aace11dff51b211ef67b6c9771d509883" Oct 13 14:32:15 crc kubenswrapper[4797]: I1013 14:32:15.942682 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-64d8-account-create-c74vt" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.532544 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqjqh"] Oct 13 14:32:17 crc kubenswrapper[4797]: E1013 14:32:17.533018 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce3d55d-49d8-4559-bb4a-aab76084a7d3" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.533039 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce3d55d-49d8-4559-bb4a-aab76084a7d3" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: E1013 14:32:17.533055 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4981c2c-7ef7-468d-808c-c1dba459c1c0" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.533062 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4981c2c-7ef7-468d-808c-c1dba459c1c0" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: E1013 14:32:17.533092 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d225d04-4356-48b7-acd1-beb811cb4320" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.533100 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d225d04-4356-48b7-acd1-beb811cb4320" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.533303 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce3d55d-49d8-4559-bb4a-aab76084a7d3" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.533319 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d225d04-4356-48b7-acd1-beb811cb4320" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.533329 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e4981c2c-7ef7-468d-808c-c1dba459c1c0" containerName="mariadb-account-create" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.534074 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.536375 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.537510 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.538215 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fcrfc" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.558257 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqjqh"] Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.663330 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-scripts\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.663388 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-config-data\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.663440 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.663534 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhvg\" (UniqueName: \"kubernetes.io/projected/bcc41bde-060e-4a36-adb7-62b14fd7cd30-kube-api-access-nnhvg\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.765210 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-scripts\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.765266 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-config-data\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.765303 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.765342 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nnhvg\" (UniqueName: \"kubernetes.io/projected/bcc41bde-060e-4a36-adb7-62b14fd7cd30-kube-api-access-nnhvg\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.771674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-config-data\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.774506 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.781517 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-scripts\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.786460 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnhvg\" (UniqueName: \"kubernetes.io/projected/bcc41bde-060e-4a36-adb7-62b14fd7cd30-kube-api-access-nnhvg\") pod \"nova-cell0-conductor-db-sync-mqjqh\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:17 crc kubenswrapper[4797]: I1013 14:32:17.872263 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:18 crc kubenswrapper[4797]: I1013 14:32:18.314918 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqjqh"] Oct 13 14:32:18 crc kubenswrapper[4797]: I1013 14:32:18.976685 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" event={"ID":"bcc41bde-060e-4a36-adb7-62b14fd7cd30","Type":"ContainerStarted","Data":"b67ce19020481302805fc64442bd0232bf00f46378795d17be1a47c37dda1d33"} Oct 13 14:32:22 crc kubenswrapper[4797]: I1013 14:32:22.236413 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:32:22 crc kubenswrapper[4797]: E1013 14:32:22.237845 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:32:28 crc kubenswrapper[4797]: I1013 14:32:28.080053 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" event={"ID":"bcc41bde-060e-4a36-adb7-62b14fd7cd30","Type":"ContainerStarted","Data":"0afc7e79af2a4e281ae15fb5c878a5a7160d75851fbd1e5d29fd13c3873f98c0"} Oct 13 14:32:28 crc kubenswrapper[4797]: I1013 14:32:28.107932 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" podStartSLOduration=2.421274168 podStartE2EDuration="11.107908638s" podCreationTimestamp="2025-10-13 14:32:17 +0000 UTC" firstStartedPulling="2025-10-13 14:32:18.312018665 +0000 UTC m=+5115.845568911" lastFinishedPulling="2025-10-13 14:32:26.998653105 
+0000 UTC m=+5124.532203381" observedRunningTime="2025-10-13 14:32:28.102009334 +0000 UTC m=+5125.635559590" watchObservedRunningTime="2025-10-13 14:32:28.107908638 +0000 UTC m=+5125.641458894" Oct 13 14:32:33 crc kubenswrapper[4797]: I1013 14:32:33.166297 4797 generic.go:334] "Generic (PLEG): container finished" podID="bcc41bde-060e-4a36-adb7-62b14fd7cd30" containerID="0afc7e79af2a4e281ae15fb5c878a5a7160d75851fbd1e5d29fd13c3873f98c0" exitCode=0 Oct 13 14:32:33 crc kubenswrapper[4797]: I1013 14:32:33.166445 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" event={"ID":"bcc41bde-060e-4a36-adb7-62b14fd7cd30","Type":"ContainerDied","Data":"0afc7e79af2a4e281ae15fb5c878a5a7160d75851fbd1e5d29fd13c3873f98c0"} Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.235661 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:32:34 crc kubenswrapper[4797]: E1013 14:32:34.236029 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.497743 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.595097 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnhvg\" (UniqueName: \"kubernetes.io/projected/bcc41bde-060e-4a36-adb7-62b14fd7cd30-kube-api-access-nnhvg\") pod \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.595200 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-config-data\") pod \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.595377 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-scripts\") pod \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.595581 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-combined-ca-bundle\") pod \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\" (UID: \"bcc41bde-060e-4a36-adb7-62b14fd7cd30\") " Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.601156 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc41bde-060e-4a36-adb7-62b14fd7cd30-kube-api-access-nnhvg" (OuterVolumeSpecName: "kube-api-access-nnhvg") pod "bcc41bde-060e-4a36-adb7-62b14fd7cd30" (UID: "bcc41bde-060e-4a36-adb7-62b14fd7cd30"). InnerVolumeSpecName "kube-api-access-nnhvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.602539 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-scripts" (OuterVolumeSpecName: "scripts") pod "bcc41bde-060e-4a36-adb7-62b14fd7cd30" (UID: "bcc41bde-060e-4a36-adb7-62b14fd7cd30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.619645 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-config-data" (OuterVolumeSpecName: "config-data") pod "bcc41bde-060e-4a36-adb7-62b14fd7cd30" (UID: "bcc41bde-060e-4a36-adb7-62b14fd7cd30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.632456 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcc41bde-060e-4a36-adb7-62b14fd7cd30" (UID: "bcc41bde-060e-4a36-adb7-62b14fd7cd30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.697996 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.698037 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.698053 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnhvg\" (UniqueName: \"kubernetes.io/projected/bcc41bde-060e-4a36-adb7-62b14fd7cd30-kube-api-access-nnhvg\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:34 crc kubenswrapper[4797]: I1013 14:32:34.698064 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc41bde-060e-4a36-adb7-62b14fd7cd30-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.187171 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" event={"ID":"bcc41bde-060e-4a36-adb7-62b14fd7cd30","Type":"ContainerDied","Data":"b67ce19020481302805fc64442bd0232bf00f46378795d17be1a47c37dda1d33"} Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.187212 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b67ce19020481302805fc64442bd0232bf00f46378795d17be1a47c37dda1d33" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.187269 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqjqh" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.280419 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:32:35 crc kubenswrapper[4797]: E1013 14:32:35.281607 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc41bde-060e-4a36-adb7-62b14fd7cd30" containerName="nova-cell0-conductor-db-sync" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.281690 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc41bde-060e-4a36-adb7-62b14fd7cd30" containerName="nova-cell0-conductor-db-sync" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.281961 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc41bde-060e-4a36-adb7-62b14fd7cd30" containerName="nova-cell0-conductor-db-sync" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.282618 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.284143 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.284378 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fcrfc" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.290620 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.409007 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: 
I1013 14:32:35.409375 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7ddk\" (UniqueName: \"kubernetes.io/projected/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-kube-api-access-t7ddk\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.409607 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.511608 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.511665 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7ddk\" (UniqueName: \"kubernetes.io/projected/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-kube-api-access-t7ddk\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.511761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.517562 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.521145 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.529527 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7ddk\" (UniqueName: \"kubernetes.io/projected/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-kube-api-access-t7ddk\") pod \"nova-cell0-conductor-0\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:35 crc kubenswrapper[4797]: I1013 14:32:35.607084 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:36 crc kubenswrapper[4797]: I1013 14:32:36.054457 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:32:36 crc kubenswrapper[4797]: I1013 14:32:36.203601 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2","Type":"ContainerStarted","Data":"7fc105c29c8edded54a1eae1b55c2a7e2d6173db088f9fab3641e2e00bec1851"} Oct 13 14:32:37 crc kubenswrapper[4797]: I1013 14:32:37.217491 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2","Type":"ContainerStarted","Data":"97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c"} Oct 13 14:32:37 crc kubenswrapper[4797]: I1013 14:32:37.218104 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:37 crc kubenswrapper[4797]: I1013 14:32:37.240146 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.24011867 podStartE2EDuration="2.24011867s" podCreationTimestamp="2025-10-13 14:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:37.235970538 +0000 UTC m=+5134.769520844" watchObservedRunningTime="2025-10-13 14:32:37.24011867 +0000 UTC m=+5134.773668926" Oct 13 14:32:45 crc kubenswrapper[4797]: I1013 14:32:45.236023 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:32:45 crc kubenswrapper[4797]: E1013 14:32:45.236699 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:32:45 crc kubenswrapper[4797]: I1013 14:32:45.645683 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.204197 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wsbtp"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.205611 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.210230 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.215583 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.216068 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wsbtp"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.308275 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-config-data\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.308338 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-combined-ca-bundle\") pod 
\"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.308409 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-scripts\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.308557 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p88rq\" (UniqueName: \"kubernetes.io/projected/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-kube-api-access-p88rq\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.354426 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.356275 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.362537 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.365489 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.377693 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.378978 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.388622 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409410 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409466 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-scripts\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409500 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40112579-d6ec-4ca8-a75f-acd4c0cd5868-logs\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409534 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-config-data\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409598 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p88rq\" (UniqueName: \"kubernetes.io/projected/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-kube-api-access-p88rq\") pod 
\"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409640 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zt6m\" (UniqueName: \"kubernetes.io/projected/40112579-d6ec-4ca8-a75f-acd4c0cd5868-kube-api-access-5zt6m\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409706 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-config-data\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.409729 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.431676 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.433562 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-config-data\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: 
\"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.438477 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.438740 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-scripts\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.449041 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p88rq\" (UniqueName: \"kubernetes.io/projected/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-kube-api-access-p88rq\") pod \"nova-cell0-cell-mapping-wsbtp\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.477541 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.479092 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.486192 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.496586 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.510924 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwdh\" (UniqueName: \"kubernetes.io/projected/93ca9b07-0502-43f7-99f7-522b362c00e8-kube-api-access-9jwdh\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.510969 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.510992 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86baf589-28a0-4464-9496-153b7152db6f-logs\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511049 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zt6m\" (UniqueName: \"kubernetes.io/projected/40112579-d6ec-4ca8-a75f-acd4c0cd5868-kube-api-access-5zt6m\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511495 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-config-data\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511555 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511672 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511690 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5x9j\" (UniqueName: \"kubernetes.io/projected/86baf589-28a0-4464-9496-153b7152db6f-kube-api-access-t5x9j\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511714 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40112579-d6ec-4ca8-a75f-acd4c0cd5868-logs\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511739 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-config-data\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.511762 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.514968 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40112579-d6ec-4ca8-a75f-acd4c0cd5868-logs\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.525392 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-config-data\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.542957 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.544556 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.545638 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.545944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.546610 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zt6m\" (UniqueName: \"kubernetes.io/projected/40112579-d6ec-4ca8-a75f-acd4c0cd5868-kube-api-access-5zt6m\") pod \"nova-api-0\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.547607 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614235 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614295 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-config-data\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614334 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5x9j\" (UniqueName: \"kubernetes.io/projected/86baf589-28a0-4464-9496-153b7152db6f-kube-api-access-t5x9j\") pod 
\"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614369 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614385 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwdh\" (UniqueName: \"kubernetes.io/projected/93ca9b07-0502-43f7-99f7-522b362c00e8-kube-api-access-9jwdh\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614409 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614432 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86baf589-28a0-4464-9496-153b7152db6f-logs\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614453 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc 
kubenswrapper[4797]: I1013 14:32:46.614488 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvgn\" (UniqueName: \"kubernetes.io/projected/a5303f67-2a7b-4c40-a910-d3769b68b3f3-kube-api-access-4cvgn\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.614537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-config-data\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.615400 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86baf589-28a0-4464-9496-153b7152db6f-logs\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.617594 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-config-data\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.620935 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.622956 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.626684 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.629560 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.647736 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwdh\" (UniqueName: \"kubernetes.io/projected/93ca9b07-0502-43f7-99f7-522b362c00e8-kube-api-access-9jwdh\") pod \"nova-cell1-novncproxy-0\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.649260 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76ddc8f5dc-dq7dh"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.651620 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.652467 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5x9j\" (UniqueName: \"kubernetes.io/projected/86baf589-28a0-4464-9496-153b7152db6f-kube-api-access-t5x9j\") pod \"nova-metadata-0\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.673983 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.674608 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ddc8f5dc-dq7dh"] Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.707108 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716227 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvgn\" (UniqueName: \"kubernetes.io/projected/a5303f67-2a7b-4c40-a910-d3769b68b3f3-kube-api-access-4cvgn\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716319 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-dns-svc\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716376 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljg8p\" (UniqueName: \"kubernetes.io/projected/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-kube-api-access-ljg8p\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716423 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-config-data\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 
14:32:46.716449 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-config\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716480 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-sb\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716599 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-nb\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.716639 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.721076 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-config-data\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.723716 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.733859 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvgn\" (UniqueName: \"kubernetes.io/projected/a5303f67-2a7b-4c40-a910-d3769b68b3f3-kube-api-access-4cvgn\") pod \"nova-scheduler-0\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.818094 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-nb\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.818172 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-dns-svc\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.818212 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljg8p\" (UniqueName: \"kubernetes.io/projected/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-kube-api-access-ljg8p\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.818241 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-config\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.818260 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-sb\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.819318 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-sb\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.819355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-nb\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.819477 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-dns-svc\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.820064 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-config\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: 
\"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.844685 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljg8p\" (UniqueName: \"kubernetes.io/projected/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-kube-api-access-ljg8p\") pod \"dnsmasq-dns-76ddc8f5dc-dq7dh\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.883716 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.949797 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:32:46 crc kubenswrapper[4797]: I1013 14:32:46.968388 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:47 crc kubenswrapper[4797]: I1013 14:32:47.119698 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wsbtp"] Oct 13 14:32:48 crc kubenswrapper[4797]: W1013 14:32:47.205618 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40112579_d6ec_4ca8_a75f_acd4c0cd5868.slice/crio-79337ef9c717a7644ea9a07db935e7ce3be954262359a5e0289a23b446eb6923 WatchSource:0}: Error finding container 79337ef9c717a7644ea9a07db935e7ce3be954262359a5e0289a23b446eb6923: Status 404 returned error can't find the container with id 79337ef9c717a7644ea9a07db935e7ce3be954262359a5e0289a23b446eb6923 Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.210248 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.284996 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.312421 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40112579-d6ec-4ca8-a75f-acd4c0cd5868","Type":"ContainerStarted","Data":"79337ef9c717a7644ea9a07db935e7ce3be954262359a5e0289a23b446eb6923"} Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.316054 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wsbtp" event={"ID":"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5","Type":"ContainerStarted","Data":"9bd863a93c8a65785701e068f128aa7804253109677f8927dd5568a0c24e036e"} Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.317404 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93ca9b07-0502-43f7-99f7-522b362c00e8","Type":"ContainerStarted","Data":"200acffd475fece6bb7a1eec5ab388e99ad96b5625242c0b999e9375315b360d"} Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.390345 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:48 crc kubenswrapper[4797]: W1013 14:32:47.396591 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86baf589_28a0_4464_9496_153b7152db6f.slice/crio-5b5dbca6dca5f2f61b14239ab00146563a240638525c798ff5d2307f813f690a WatchSource:0}: Error finding container 5b5dbca6dca5f2f61b14239ab00146563a240638525c798ff5d2307f813f690a: Status 404 returned error can't find the container with id 5b5dbca6dca5f2f61b14239ab00146563a240638525c798ff5d2307f813f690a Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.630749 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkk9c"] Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.633980 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.641212 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.646506 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.655535 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkk9c"] Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.739895 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-config-data\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.739957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-scripts\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.740041 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmgpz\" (UniqueName: \"kubernetes.io/projected/e14a029e-0f34-443e-bf02-4ffc65c90307-kube-api-access-mmgpz\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.740133 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.842075 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-config-data\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.842172 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-scripts\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.842725 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmgpz\" (UniqueName: \"kubernetes.io/projected/e14a029e-0f34-443e-bf02-4ffc65c90307-kube-api-access-mmgpz\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.842854 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.847548 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-scripts\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.847720 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.863150 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmgpz\" (UniqueName: \"kubernetes.io/projected/e14a029e-0f34-443e-bf02-4ffc65c90307-kube-api-access-mmgpz\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.874035 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-config-data\") pod \"nova-cell1-conductor-db-sync-rkk9c\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:47.959616 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:48.346093 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86baf589-28a0-4464-9496-153b7152db6f","Type":"ContainerStarted","Data":"5b5dbca6dca5f2f61b14239ab00146563a240638525c798ff5d2307f813f690a"} Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:48.351002 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wsbtp" event={"ID":"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5","Type":"ContainerStarted","Data":"11b52077a25ff4ef47422e07552263ace81d9cb6b39f17b32d0922f6ca0b32eb"} Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:48.366913 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ddc8f5dc-dq7dh"] Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:48.375952 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:48.380214 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wsbtp" podStartSLOduration=2.380193839 podStartE2EDuration="2.380193839s" podCreationTimestamp="2025-10-13 14:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:48.363196423 +0000 UTC m=+5145.896746699" watchObservedRunningTime="2025-10-13 14:32:48.380193839 +0000 UTC m=+5145.913744105" Oct 13 14:32:48 crc kubenswrapper[4797]: I1013 14:32:48.571005 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkk9c"] Oct 13 14:32:49 crc kubenswrapper[4797]: I1013 14:32:49.379079 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"a5303f67-2a7b-4c40-a910-d3769b68b3f3","Type":"ContainerStarted","Data":"4af1a47b92ff5c29eb9e8619ef5a38e9ee992630c84c0fb0ef85838decfa1e19"} Oct 13 14:32:49 crc kubenswrapper[4797]: I1013 14:32:49.386462 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" event={"ID":"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96","Type":"ContainerStarted","Data":"4a9d78b5402ff14cc370909061285680d329a48edcf6fccbe337d38b01cc66a4"} Oct 13 14:32:49 crc kubenswrapper[4797]: I1013 14:32:49.398038 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" event={"ID":"e14a029e-0f34-443e-bf02-4ffc65c90307","Type":"ContainerStarted","Data":"5e388e71647a0cf09b9b216d277678bfbd46cfee19f3cd78fa193ca594fc48f2"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.413841 4797 generic.go:334] "Generic (PLEG): container finished" podID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerID="ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd" exitCode=0 Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.413935 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" event={"ID":"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96","Type":"ContainerDied","Data":"ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.416048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" event={"ID":"e14a029e-0f34-443e-bf02-4ffc65c90307","Type":"ContainerStarted","Data":"760fd1ac730f931272315a4082868d802b7d7dfc344e4b48b04f26e08d8abef2"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.418174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86baf589-28a0-4464-9496-153b7152db6f","Type":"ContainerStarted","Data":"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809"} Oct 13 14:32:50 crc 
kubenswrapper[4797]: I1013 14:32:50.418237 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86baf589-28a0-4464-9496-153b7152db6f","Type":"ContainerStarted","Data":"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.421936 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93ca9b07-0502-43f7-99f7-522b362c00e8","Type":"ContainerStarted","Data":"21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.436847 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40112579-d6ec-4ca8-a75f-acd4c0cd5868","Type":"ContainerStarted","Data":"32ae334187ed756c4b907c1d6d39c4562697bd0127e6caf0b170d2a063713ad9"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.436917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40112579-d6ec-4ca8-a75f-acd4c0cd5868","Type":"ContainerStarted","Data":"f18fae5bdd78e7c4562afb9ad0bd9efbf06d1d929642dcfbd52010dd10fd24aa"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.443426 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5303f67-2a7b-4c40-a910-d3769b68b3f3","Type":"ContainerStarted","Data":"18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b"} Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.504985 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.551709241 podStartE2EDuration="4.50496368s" podCreationTimestamp="2025-10-13 14:32:46 +0000 UTC" firstStartedPulling="2025-10-13 14:32:47.398754207 +0000 UTC m=+5144.932304463" lastFinishedPulling="2025-10-13 14:32:49.352008646 +0000 UTC m=+5146.885558902" observedRunningTime="2025-10-13 14:32:50.476189805 +0000 UTC 
m=+5148.009740061" watchObservedRunningTime="2025-10-13 14:32:50.50496368 +0000 UTC m=+5148.038513936" Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.517136 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.374239084 podStartE2EDuration="4.517107047s" podCreationTimestamp="2025-10-13 14:32:46 +0000 UTC" firstStartedPulling="2025-10-13 14:32:47.208050377 +0000 UTC m=+5144.741600633" lastFinishedPulling="2025-10-13 14:32:49.35091833 +0000 UTC m=+5146.884468596" observedRunningTime="2025-10-13 14:32:50.504300543 +0000 UTC m=+5148.037850809" watchObservedRunningTime="2025-10-13 14:32:50.517107047 +0000 UTC m=+5148.050657303" Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.535919 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.504104475 podStartE2EDuration="4.535890637s" podCreationTimestamp="2025-10-13 14:32:46 +0000 UTC" firstStartedPulling="2025-10-13 14:32:47.297920768 +0000 UTC m=+5144.831471024" lastFinishedPulling="2025-10-13 14:32:49.32970694 +0000 UTC m=+5146.863257186" observedRunningTime="2025-10-13 14:32:50.525164724 +0000 UTC m=+5148.058715000" watchObservedRunningTime="2025-10-13 14:32:50.535890637 +0000 UTC m=+5148.069440913" Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.563418 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" podStartSLOduration=3.56337908 podStartE2EDuration="3.56337908s" podCreationTimestamp="2025-10-13 14:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:50.560255953 +0000 UTC m=+5148.093806229" watchObservedRunningTime="2025-10-13 14:32:50.56337908 +0000 UTC m=+5148.096929336" Oct 13 14:32:50 crc kubenswrapper[4797]: I1013 14:32:50.592870 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.035854515 podStartE2EDuration="4.592851672s" podCreationTimestamp="2025-10-13 14:32:46 +0000 UTC" firstStartedPulling="2025-10-13 14:32:48.61216509 +0000 UTC m=+5146.145715346" lastFinishedPulling="2025-10-13 14:32:50.169162247 +0000 UTC m=+5147.702712503" observedRunningTime="2025-10-13 14:32:50.587140032 +0000 UTC m=+5148.120690288" watchObservedRunningTime="2025-10-13 14:32:50.592851672 +0000 UTC m=+5148.126401928" Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.459222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" event={"ID":"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96","Type":"ContainerStarted","Data":"733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b"} Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.486419 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" podStartSLOduration=5.486383321 podStartE2EDuration="5.486383321s" podCreationTimestamp="2025-10-13 14:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:51.484613018 +0000 UTC m=+5149.018163304" watchObservedRunningTime="2025-10-13 14:32:51.486383321 +0000 UTC m=+5149.019933577" Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.707457 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.884419 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.884495 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.950347 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 14:32:51 crc kubenswrapper[4797]: I1013 14:32:51.969518 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:53 crc kubenswrapper[4797]: I1013 14:32:53.483306 4797 generic.go:334] "Generic (PLEG): container finished" podID="e14a029e-0f34-443e-bf02-4ffc65c90307" containerID="760fd1ac730f931272315a4082868d802b7d7dfc344e4b48b04f26e08d8abef2" exitCode=0 Oct 13 14:32:53 crc kubenswrapper[4797]: I1013 14:32:53.483454 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" event={"ID":"e14a029e-0f34-443e-bf02-4ffc65c90307","Type":"ContainerDied","Data":"760fd1ac730f931272315a4082868d802b7d7dfc344e4b48b04f26e08d8abef2"} Oct 13 14:32:53 crc kubenswrapper[4797]: I1013 14:32:53.486416 4797 generic.go:334] "Generic (PLEG): container finished" podID="8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" containerID="11b52077a25ff4ef47422e07552263ace81d9cb6b39f17b32d0922f6ca0b32eb" exitCode=0 Oct 13 14:32:53 crc kubenswrapper[4797]: I1013 14:32:53.486458 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wsbtp" event={"ID":"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5","Type":"ContainerDied","Data":"11b52077a25ff4ef47422e07552263ace81d9cb6b39f17b32d0922f6ca0b32eb"} Oct 13 14:32:54 crc kubenswrapper[4797]: I1013 14:32:54.926538 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:54 crc kubenswrapper[4797]: I1013 14:32:54.932403 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.101822 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p88rq\" (UniqueName: \"kubernetes.io/projected/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-kube-api-access-p88rq\") pod \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102322 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-scripts\") pod \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102353 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmgpz\" (UniqueName: \"kubernetes.io/projected/e14a029e-0f34-443e-bf02-4ffc65c90307-kube-api-access-mmgpz\") pod \"e14a029e-0f34-443e-bf02-4ffc65c90307\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102568 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-config-data\") pod \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102588 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-scripts\") pod \"e14a029e-0f34-443e-bf02-4ffc65c90307\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102613 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-combined-ca-bundle\") pod \"e14a029e-0f34-443e-bf02-4ffc65c90307\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102738 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-combined-ca-bundle\") pod \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\" (UID: \"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.102788 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-config-data\") pod \"e14a029e-0f34-443e-bf02-4ffc65c90307\" (UID: \"e14a029e-0f34-443e-bf02-4ffc65c90307\") " Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.109758 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-kube-api-access-p88rq" (OuterVolumeSpecName: "kube-api-access-p88rq") pod "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" (UID: "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5"). InnerVolumeSpecName "kube-api-access-p88rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.111481 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14a029e-0f34-443e-bf02-4ffc65c90307-kube-api-access-mmgpz" (OuterVolumeSpecName: "kube-api-access-mmgpz") pod "e14a029e-0f34-443e-bf02-4ffc65c90307" (UID: "e14a029e-0f34-443e-bf02-4ffc65c90307"). InnerVolumeSpecName "kube-api-access-mmgpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.111579 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-scripts" (OuterVolumeSpecName: "scripts") pod "e14a029e-0f34-443e-bf02-4ffc65c90307" (UID: "e14a029e-0f34-443e-bf02-4ffc65c90307"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.111601 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-scripts" (OuterVolumeSpecName: "scripts") pod "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" (UID: "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.132860 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e14a029e-0f34-443e-bf02-4ffc65c90307" (UID: "e14a029e-0f34-443e-bf02-4ffc65c90307"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.135051 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-config-data" (OuterVolumeSpecName: "config-data") pod "e14a029e-0f34-443e-bf02-4ffc65c90307" (UID: "e14a029e-0f34-443e-bf02-4ffc65c90307"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.136287 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" (UID: "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.147977 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-config-data" (OuterVolumeSpecName: "config-data") pod "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" (UID: "8e83dc9a-ad8e-480e-b41e-f140d7ccacb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204641 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p88rq\" (UniqueName: \"kubernetes.io/projected/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-kube-api-access-p88rq\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204679 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204691 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmgpz\" (UniqueName: \"kubernetes.io/projected/e14a029e-0f34-443e-bf02-4ffc65c90307-kube-api-access-mmgpz\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204698 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-config-data\") on node \"crc\" DevicePath 
\"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204708 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204718 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204725 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.204733 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14a029e-0f34-443e-bf02-4ffc65c90307-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.510784 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wsbtp" event={"ID":"8e83dc9a-ad8e-480e-b41e-f140d7ccacb5","Type":"ContainerDied","Data":"9bd863a93c8a65785701e068f128aa7804253109677f8927dd5568a0c24e036e"} Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.510968 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bd863a93c8a65785701e068f128aa7804253109677f8927dd5568a0c24e036e" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.511762 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wsbtp" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.513994 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" event={"ID":"e14a029e-0f34-443e-bf02-4ffc65c90307","Type":"ContainerDied","Data":"5e388e71647a0cf09b9b216d277678bfbd46cfee19f3cd78fa193ca594fc48f2"} Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.514040 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e388e71647a0cf09b9b216d277678bfbd46cfee19f3cd78fa193ca594fc48f2" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.514128 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkk9c" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.620528 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:32:55 crc kubenswrapper[4797]: E1013 14:32:55.620949 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" containerName="nova-manage" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.620969 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" containerName="nova-manage" Oct 13 14:32:55 crc kubenswrapper[4797]: E1013 14:32:55.620990 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14a029e-0f34-443e-bf02-4ffc65c90307" containerName="nova-cell1-conductor-db-sync" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.620998 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14a029e-0f34-443e-bf02-4ffc65c90307" containerName="nova-cell1-conductor-db-sync" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.621223 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" containerName="nova-manage" Oct 13 14:32:55 crc 
kubenswrapper[4797]: I1013 14:32:55.621251 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14a029e-0f34-443e-bf02-4ffc65c90307" containerName="nova-cell1-conductor-db-sync" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.621974 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.624317 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.639076 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.722292 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.722854 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.722897 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzfk\" (UniqueName: \"kubernetes.io/projected/912acab0-a01a-4f1a-9bcd-825354752818-kube-api-access-5fzfk\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.758396 4797 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.758669 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-log" containerID="cri-o://f18fae5bdd78e7c4562afb9ad0bd9efbf06d1d929642dcfbd52010dd10fd24aa" gracePeriod=30 Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.758735 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-api" containerID="cri-o://32ae334187ed756c4b907c1d6d39c4562697bd0127e6caf0b170d2a063713ad9" gracePeriod=30 Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.774262 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.774504 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a5303f67-2a7b-4c40-a910-d3769b68b3f3" containerName="nova-scheduler-scheduler" containerID="cri-o://18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b" gracePeriod=30 Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.829297 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.829348 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzfk\" (UniqueName: \"kubernetes.io/projected/912acab0-a01a-4f1a-9bcd-825354752818-kube-api-access-5fzfk\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " 
pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.829487 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.840965 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.855842 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.861518 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzfk\" (UniqueName: \"kubernetes.io/projected/912acab0-a01a-4f1a-9bcd-825354752818-kube-api-access-5fzfk\") pod \"nova-cell1-conductor-0\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.896436 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.896828 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-log" 
containerID="cri-o://444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df" gracePeriod=30 Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.897356 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-metadata" containerID="cri-o://fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809" gracePeriod=30 Oct 13 14:32:55 crc kubenswrapper[4797]: I1013 14:32:55.951509 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.478763 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.490083 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539048 4797 generic.go:334] "Generic (PLEG): container finished" podID="86baf589-28a0-4464-9496-153b7152db6f" containerID="fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809" exitCode=0 Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539271 4797 generic.go:334] "Generic (PLEG): container finished" podID="86baf589-28a0-4464-9496-153b7152db6f" containerID="444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df" exitCode=143 Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539139 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86baf589-28a0-4464-9496-153b7152db6f","Type":"ContainerDied","Data":"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809"} Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539440 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"86baf589-28a0-4464-9496-153b7152db6f","Type":"ContainerDied","Data":"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df"} Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539501 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86baf589-28a0-4464-9496-153b7152db6f","Type":"ContainerDied","Data":"5b5dbca6dca5f2f61b14239ab00146563a240638525c798ff5d2307f813f690a"} Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539113 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.539547 4797 scope.go:117] "RemoveContainer" containerID="fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.541089 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"912acab0-a01a-4f1a-9bcd-825354752818","Type":"ContainerStarted","Data":"5906046ddd0238ec717751a5b3a97f8a999aef8e4eb3a075f4e7996c897df87a"} Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.543547 4797 generic.go:334] "Generic (PLEG): container finished" podID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerID="32ae334187ed756c4b907c1d6d39c4562697bd0127e6caf0b170d2a063713ad9" exitCode=0 Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.543592 4797 generic.go:334] "Generic (PLEG): container finished" podID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerID="f18fae5bdd78e7c4562afb9ad0bd9efbf06d1d929642dcfbd52010dd10fd24aa" exitCode=143 Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.543615 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40112579-d6ec-4ca8-a75f-acd4c0cd5868","Type":"ContainerDied","Data":"32ae334187ed756c4b907c1d6d39c4562697bd0127e6caf0b170d2a063713ad9"} Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.543640 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40112579-d6ec-4ca8-a75f-acd4c0cd5868","Type":"ContainerDied","Data":"f18fae5bdd78e7c4562afb9ad0bd9efbf06d1d929642dcfbd52010dd10fd24aa"} Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.561291 4797 scope.go:117] "RemoveContainer" containerID="444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.594834 4797 scope.go:117] "RemoveContainer" containerID="fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809" Oct 13 14:32:56 crc kubenswrapper[4797]: E1013 14:32:56.595428 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809\": container with ID starting with fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809 not found: ID does not exist" containerID="fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.595463 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809"} err="failed to get container status \"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809\": rpc error: code = NotFound desc = could not find container \"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809\": container with ID starting with fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809 not found: ID does not exist" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.595496 4797 scope.go:117] "RemoveContainer" containerID="444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df" Oct 13 14:32:56 crc kubenswrapper[4797]: E1013 14:32:56.596747 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df\": container with ID starting with 444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df not found: ID does not exist" containerID="444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.596820 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df"} err="failed to get container status \"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df\": rpc error: code = NotFound desc = could not find container \"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df\": container with ID starting with 444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df not found: ID does not exist" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.596861 4797 scope.go:117] "RemoveContainer" containerID="fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.597721 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809"} err="failed to get container status \"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809\": rpc error: code = NotFound desc = could not find container \"fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809\": container with ID starting with fe5bdb82a7bed77475726e456421d6e033a5ed5d96f953910aea7a16a3735809 not found: ID does not exist" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.597742 4797 scope.go:117] "RemoveContainer" containerID="444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.597998 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df"} err="failed to get container status \"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df\": rpc error: code = NotFound desc = could not find container \"444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df\": container with ID starting with 444b7c4dc1a5a1652d61e254d80a8559c868369859d63fa9a61cf5712db933df not found: ID does not exist" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.646769 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-config-data\") pod \"86baf589-28a0-4464-9496-153b7152db6f\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.647310 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-combined-ca-bundle\") pod \"86baf589-28a0-4464-9496-153b7152db6f\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.647354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5x9j\" (UniqueName: \"kubernetes.io/projected/86baf589-28a0-4464-9496-153b7152db6f-kube-api-access-t5x9j\") pod \"86baf589-28a0-4464-9496-153b7152db6f\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.647433 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86baf589-28a0-4464-9496-153b7152db6f-logs\") pod \"86baf589-28a0-4464-9496-153b7152db6f\" (UID: \"86baf589-28a0-4464-9496-153b7152db6f\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.648022 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/86baf589-28a0-4464-9496-153b7152db6f-logs" (OuterVolumeSpecName: "logs") pod "86baf589-28a0-4464-9496-153b7152db6f" (UID: "86baf589-28a0-4464-9496-153b7152db6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.653399 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86baf589-28a0-4464-9496-153b7152db6f-kube-api-access-t5x9j" (OuterVolumeSpecName: "kube-api-access-t5x9j") pod "86baf589-28a0-4464-9496-153b7152db6f" (UID: "86baf589-28a0-4464-9496-153b7152db6f"). InnerVolumeSpecName "kube-api-access-t5x9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.673650 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.675576 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86baf589-28a0-4464-9496-153b7152db6f" (UID: "86baf589-28a0-4464-9496-153b7152db6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.682616 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-config-data" (OuterVolumeSpecName: "config-data") pod "86baf589-28a0-4464-9496-153b7152db6f" (UID: "86baf589-28a0-4464-9496-153b7152db6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.707683 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.718731 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.749506 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.749540 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5x9j\" (UniqueName: \"kubernetes.io/projected/86baf589-28a0-4464-9496-153b7152db6f-kube-api-access-t5x9j\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.749549 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86baf589-28a0-4464-9496-153b7152db6f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.749559 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86baf589-28a0-4464-9496-153b7152db6f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.850318 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zt6m\" (UniqueName: \"kubernetes.io/projected/40112579-d6ec-4ca8-a75f-acd4c0cd5868-kube-api-access-5zt6m\") pod \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.850409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-combined-ca-bundle\") pod \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.850489 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40112579-d6ec-4ca8-a75f-acd4c0cd5868-logs\") pod \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.850579 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-config-data\") pod \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\" (UID: \"40112579-d6ec-4ca8-a75f-acd4c0cd5868\") " Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.850881 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40112579-d6ec-4ca8-a75f-acd4c0cd5868-logs" (OuterVolumeSpecName: "logs") pod "40112579-d6ec-4ca8-a75f-acd4c0cd5868" (UID: "40112579-d6ec-4ca8-a75f-acd4c0cd5868"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.851635 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40112579-d6ec-4ca8-a75f-acd4c0cd5868-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.854150 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40112579-d6ec-4ca8-a75f-acd4c0cd5868-kube-api-access-5zt6m" (OuterVolumeSpecName: "kube-api-access-5zt6m") pod "40112579-d6ec-4ca8-a75f-acd4c0cd5868" (UID: "40112579-d6ec-4ca8-a75f-acd4c0cd5868"). InnerVolumeSpecName "kube-api-access-5zt6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.874296 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-config-data" (OuterVolumeSpecName: "config-data") pod "40112579-d6ec-4ca8-a75f-acd4c0cd5868" (UID: "40112579-d6ec-4ca8-a75f-acd4c0cd5868"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.876927 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40112579-d6ec-4ca8-a75f-acd4c0cd5868" (UID: "40112579-d6ec-4ca8-a75f-acd4c0cd5868"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.956477 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.956539 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zt6m\" (UniqueName: \"kubernetes.io/projected/40112579-d6ec-4ca8-a75f-acd4c0cd5868-kube-api-access-5zt6m\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.956554 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40112579-d6ec-4ca8-a75f-acd4c0cd5868-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.969192 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.976048 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.982352 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.992526 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:56 crc kubenswrapper[4797]: E1013 14:32:56.992974 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-api" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.992992 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-api" Oct 13 14:32:56 crc kubenswrapper[4797]: E1013 14:32:56.993011 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-log" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.993017 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-log" Oct 13 14:32:56 crc kubenswrapper[4797]: E1013 14:32:56.993029 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-metadata" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.993035 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-metadata" Oct 13 14:32:56 crc kubenswrapper[4797]: E1013 14:32:56.993056 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-log" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.993062 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-log" Oct 13 14:32:56 crc 
kubenswrapper[4797]: I1013 14:32:56.993234 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-api" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.993249 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-log" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.993260 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" containerName="nova-api-log" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.993274 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="86baf589-28a0-4464-9496-153b7152db6f" containerName="nova-metadata-metadata" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.994194 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:32:56 crc kubenswrapper[4797]: I1013 14:32:56.999146 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.002887 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.093778 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9899cb4c-j4tsm"] Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.094105 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="dnsmasq-dns" containerID="cri-o://d6337036cf4355641b75f773ea7af8f0230be877189c8329a9121afb5b8ae26b" gracePeriod=10 Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.166871 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-config-data\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.166916 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjnz5\" (UniqueName: \"kubernetes.io/projected/6f9bf017-4407-42c7-aacd-719a6882b0c5-kube-api-access-vjnz5\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.166944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f9bf017-4407-42c7-aacd-719a6882b0c5-logs\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.167097 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.257093 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86baf589-28a0-4464-9496-153b7152db6f" path="/var/lib/kubelet/pods/86baf589-28a0-4464-9496-153b7152db6f/volumes" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.269211 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: 
I1013 14:32:57.269341 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-config-data\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.269367 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjnz5\" (UniqueName: \"kubernetes.io/projected/6f9bf017-4407-42c7-aacd-719a6882b0c5-kube-api-access-vjnz5\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.269425 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f9bf017-4407-42c7-aacd-719a6882b0c5-logs\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.270720 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f9bf017-4407-42c7-aacd-719a6882b0c5-logs\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.275867 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.303016 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjnz5\" (UniqueName: \"kubernetes.io/projected/6f9bf017-4407-42c7-aacd-719a6882b0c5-kube-api-access-vjnz5\") pod 
\"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.303173 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-config-data\") pod \"nova-metadata-0\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.317931 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.567305 4797 generic.go:334] "Generic (PLEG): container finished" podID="22a19aef-3dad-4ad9-a048-600878189bf6" containerID="d6337036cf4355641b75f773ea7af8f0230be877189c8329a9121afb5b8ae26b" exitCode=0 Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.567388 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" event={"ID":"22a19aef-3dad-4ad9-a048-600878189bf6","Type":"ContainerDied","Data":"d6337036cf4355641b75f773ea7af8f0230be877189c8329a9121afb5b8ae26b"} Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.567499 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.573658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2qns\" (UniqueName: \"kubernetes.io/projected/22a19aef-3dad-4ad9-a048-600878189bf6-kube-api-access-s2qns\") pod \"22a19aef-3dad-4ad9-a048-600878189bf6\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.575973 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-config\") pod \"22a19aef-3dad-4ad9-a048-600878189bf6\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.576258 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-nb\") pod \"22a19aef-3dad-4ad9-a048-600878189bf6\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.577012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"912acab0-a01a-4f1a-9bcd-825354752818","Type":"ContainerStarted","Data":"7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e"} Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.577496 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.581215 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a19aef-3dad-4ad9-a048-600878189bf6-kube-api-access-s2qns" (OuterVolumeSpecName: "kube-api-access-s2qns") pod "22a19aef-3dad-4ad9-a048-600878189bf6" (UID: "22a19aef-3dad-4ad9-a048-600878189bf6"). 
InnerVolumeSpecName "kube-api-access-s2qns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.587296 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40112579-d6ec-4ca8-a75f-acd4c0cd5868","Type":"ContainerDied","Data":"79337ef9c717a7644ea9a07db935e7ce3be954262359a5e0289a23b446eb6923"} Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.587328 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.587346 4797 scope.go:117] "RemoveContainer" containerID="32ae334187ed756c4b907c1d6d39c4562697bd0127e6caf0b170d2a063713ad9" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.623943 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.631685 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.631665253 podStartE2EDuration="2.631665253s" podCreationTimestamp="2025-10-13 14:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:57.624057706 +0000 UTC m=+5155.157607962" watchObservedRunningTime="2025-10-13 14:32:57.631665253 +0000 UTC m=+5155.165215509" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.654995 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22a19aef-3dad-4ad9-a048-600878189bf6" (UID: "22a19aef-3dad-4ad9-a048-600878189bf6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.656302 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.666000 4797 scope.go:117] "RemoveContainer" containerID="f18fae5bdd78e7c4562afb9ad0bd9efbf06d1d929642dcfbd52010dd10fd24aa" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.677567 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-dns-svc\") pod \"22a19aef-3dad-4ad9-a048-600878189bf6\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.677617 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-sb\") pod \"22a19aef-3dad-4ad9-a048-600878189bf6\" (UID: \"22a19aef-3dad-4ad9-a048-600878189bf6\") " Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.678218 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.678233 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2qns\" (UniqueName: \"kubernetes.io/projected/22a19aef-3dad-4ad9-a048-600878189bf6-kube-api-access-s2qns\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.679302 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.698000 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:57 crc kubenswrapper[4797]: E1013 14:32:57.698475 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="dnsmasq-dns" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.698490 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="dnsmasq-dns" Oct 13 14:32:57 crc kubenswrapper[4797]: E1013 14:32:57.698531 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="init" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.698559 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="init" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.698777 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="dnsmasq-dns" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.699959 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.705556 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-config" (OuterVolumeSpecName: "config") pod "22a19aef-3dad-4ad9-a048-600878189bf6" (UID: "22a19aef-3dad-4ad9-a048-600878189bf6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.708153 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.750557 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22a19aef-3dad-4ad9-a048-600878189bf6" (UID: "22a19aef-3dad-4ad9-a048-600878189bf6"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.752503 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.760558 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22a19aef-3dad-4ad9-a048-600878189bf6" (UID: "22a19aef-3dad-4ad9-a048-600878189bf6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779722 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc2xx\" (UniqueName: \"kubernetes.io/projected/8aff75a0-5bce-4173-a77e-81e9bf3711b9-kube-api-access-kc2xx\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779780 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-config-data\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779848 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aff75a0-5bce-4173-a77e-81e9bf3711b9-logs\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779908 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779977 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779988 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.779998 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a19aef-3dad-4ad9-a048-600878189bf6-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.819696 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:32:57 crc kubenswrapper[4797]: W1013 14:32:57.822571 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9bf017_4407_42c7_aacd_719a6882b0c5.slice/crio-1c21bf5387cb8f6395a8a02611325ad64a5083bb720560b0ff3a3ff702344fb8 WatchSource:0}: Error finding container 1c21bf5387cb8f6395a8a02611325ad64a5083bb720560b0ff3a3ff702344fb8: Status 404 returned error can't find the container with id 1c21bf5387cb8f6395a8a02611325ad64a5083bb720560b0ff3a3ff702344fb8 Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.881057 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc2xx\" (UniqueName: \"kubernetes.io/projected/8aff75a0-5bce-4173-a77e-81e9bf3711b9-kube-api-access-kc2xx\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " 
pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.881129 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-config-data\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.881158 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aff75a0-5bce-4173-a77e-81e9bf3711b9-logs\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.881219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.881880 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aff75a0-5bce-4173-a77e-81e9bf3711b9-logs\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.886091 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.886253 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-config-data\") pod \"nova-api-0\" 
(UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:57 crc kubenswrapper[4797]: I1013 14:32:57.899584 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc2xx\" (UniqueName: \"kubernetes.io/projected/8aff75a0-5bce-4173-a77e-81e9bf3711b9-kube-api-access-kc2xx\") pod \"nova-api-0\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " pod="openstack/nova-api-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.029264 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.288437 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:32:58 crc kubenswrapper[4797]: W1013 14:32:58.298409 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aff75a0_5bce_4173_a77e_81e9bf3711b9.slice/crio-d7a11f2122cd0064c2cf807257ced937d72e964ce3449d760644032d6d52954f WatchSource:0}: Error finding container d7a11f2122cd0064c2cf807257ced937d72e964ce3449d760644032d6d52954f: Status 404 returned error can't find the container with id d7a11f2122cd0064c2cf807257ced937d72e964ce3449d760644032d6d52954f Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.473579 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.498022 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-config-data\") pod \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.498176 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cvgn\" (UniqueName: \"kubernetes.io/projected/a5303f67-2a7b-4c40-a910-d3769b68b3f3-kube-api-access-4cvgn\") pod \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.498206 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-combined-ca-bundle\") pod \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\" (UID: \"a5303f67-2a7b-4c40-a910-d3769b68b3f3\") " Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.502624 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5303f67-2a7b-4c40-a910-d3769b68b3f3-kube-api-access-4cvgn" (OuterVolumeSpecName: "kube-api-access-4cvgn") pod "a5303f67-2a7b-4c40-a910-d3769b68b3f3" (UID: "a5303f67-2a7b-4c40-a910-d3769b68b3f3"). InnerVolumeSpecName "kube-api-access-4cvgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.529336 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-config-data" (OuterVolumeSpecName: "config-data") pod "a5303f67-2a7b-4c40-a910-d3769b68b3f3" (UID: "a5303f67-2a7b-4c40-a910-d3769b68b3f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.537934 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5303f67-2a7b-4c40-a910-d3769b68b3f3" (UID: "a5303f67-2a7b-4c40-a910-d3769b68b3f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.599641 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cvgn\" (UniqueName: \"kubernetes.io/projected/a5303f67-2a7b-4c40-a910-d3769b68b3f3-kube-api-access-4cvgn\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.599689 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.599708 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5303f67-2a7b-4c40-a910-d3769b68b3f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.600588 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aff75a0-5bce-4173-a77e-81e9bf3711b9","Type":"ContainerStarted","Data":"bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.600663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aff75a0-5bce-4173-a77e-81e9bf3711b9","Type":"ContainerStarted","Data":"d7a11f2122cd0064c2cf807257ced937d72e964ce3449d760644032d6d52954f"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.603156 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" event={"ID":"22a19aef-3dad-4ad9-a048-600878189bf6","Type":"ContainerDied","Data":"37b318e5dfe1fd8d9b8a12a4a6666f2cfc140952765a3c991b3012895d87c968"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.603222 4797 scope.go:117] "RemoveContainer" containerID="d6337036cf4355641b75f773ea7af8f0230be877189c8329a9121afb5b8ae26b" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.603362 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.615162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f9bf017-4407-42c7-aacd-719a6882b0c5","Type":"ContainerStarted","Data":"68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.615214 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f9bf017-4407-42c7-aacd-719a6882b0c5","Type":"ContainerStarted","Data":"1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.615229 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f9bf017-4407-42c7-aacd-719a6882b0c5","Type":"ContainerStarted","Data":"1c21bf5387cb8f6395a8a02611325ad64a5083bb720560b0ff3a3ff702344fb8"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.618131 4797 generic.go:334] "Generic (PLEG): container finished" podID="a5303f67-2a7b-4c40-a910-d3769b68b3f3" containerID="18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b" exitCode=0 Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.618196 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.618239 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5303f67-2a7b-4c40-a910-d3769b68b3f3","Type":"ContainerDied","Data":"18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.618273 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5303f67-2a7b-4c40-a910-d3769b68b3f3","Type":"ContainerDied","Data":"4af1a47b92ff5c29eb9e8619ef5a38e9ee992630c84c0fb0ef85838decfa1e19"} Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.640572 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.640549646 podStartE2EDuration="2.640549646s" podCreationTimestamp="2025-10-13 14:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:58.634769035 +0000 UTC m=+5156.168319341" watchObservedRunningTime="2025-10-13 14:32:58.640549646 +0000 UTC m=+5156.174099912" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.658884 4797 scope.go:117] "RemoveContainer" containerID="cdccce2c6e8d08cc7bbb02be5f79b40826c1f325e1b05cf2c1dec39d04c4c62b" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.675510 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9899cb4c-j4tsm"] Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.684393 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d9899cb4c-j4tsm"] Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.691634 4797 scope.go:117] "RemoveContainer" containerID="18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.720497 4797 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.737319 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.749607 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:58 crc kubenswrapper[4797]: E1013 14:32:58.750143 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5303f67-2a7b-4c40-a910-d3769b68b3f3" containerName="nova-scheduler-scheduler" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.750168 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5303f67-2a7b-4c40-a910-d3769b68b3f3" containerName="nova-scheduler-scheduler" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.750467 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5303f67-2a7b-4c40-a910-d3769b68b3f3" containerName="nova-scheduler-scheduler" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.751787 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.756419 4797 scope.go:117] "RemoveContainer" containerID="18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.756860 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.758779 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:58 crc kubenswrapper[4797]: E1013 14:32:58.762987 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b\": container with ID starting with 18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b not found: ID does not exist" containerID="18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.763042 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b"} err="failed to get container status \"18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b\": rpc error: code = NotFound desc = could not find container \"18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b\": container with ID starting with 18e193392e0d839f62480d10e22d79b1a0fa0560a5c1e9c87865962cad6a4a9b not found: ID does not exist" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.803366 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qp2t\" (UniqueName: \"kubernetes.io/projected/e6606df7-682a-4a8f-a465-f241fc8ce5de-kube-api-access-9qp2t\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " 
pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.803768 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.804090 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-config-data\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.906077 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-config-data\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.906171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qp2t\" (UniqueName: \"kubernetes.io/projected/e6606df7-682a-4a8f-a465-f241fc8ce5de-kube-api-access-9qp2t\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.906218 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.912553 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-config-data\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.914169 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:58 crc kubenswrapper[4797]: I1013 14:32:58.928774 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qp2t\" (UniqueName: \"kubernetes.io/projected/e6606df7-682a-4a8f-a465-f241fc8ce5de-kube-api-access-9qp2t\") pod \"nova-scheduler-0\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " pod="openstack/nova-scheduler-0" Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.077318 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.239128 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:32:59 crc kubenswrapper[4797]: E1013 14:32:59.239362 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.257749 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" path="/var/lib/kubelet/pods/22a19aef-3dad-4ad9-a048-600878189bf6/volumes" Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.258631 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40112579-d6ec-4ca8-a75f-acd4c0cd5868" path="/var/lib/kubelet/pods/40112579-d6ec-4ca8-a75f-acd4c0cd5868/volumes" Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.259444 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5303f67-2a7b-4c40-a910-d3769b68b3f3" path="/var/lib/kubelet/pods/a5303f67-2a7b-4c40-a910-d3769b68b3f3/volumes" Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.528038 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.634624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6606df7-682a-4a8f-a465-f241fc8ce5de","Type":"ContainerStarted","Data":"c8437af83d40b2ea3cd07920cc2c0da8c691e463e888bb643663aa0dc35e22d4"} Oct 13 14:32:59 crc kubenswrapper[4797]: I1013 14:32:59.647340 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aff75a0-5bce-4173-a77e-81e9bf3711b9","Type":"ContainerStarted","Data":"703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b"} Oct 13 14:33:00 crc kubenswrapper[4797]: I1013 14:33:00.658868 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6606df7-682a-4a8f-a465-f241fc8ce5de","Type":"ContainerStarted","Data":"ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341"} Oct 13 14:33:00 crc kubenswrapper[4797]: I1013 14:33:00.678153 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.678134632 podStartE2EDuration="3.678134632s" podCreationTimestamp="2025-10-13 14:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:32:59.668242062 +0000 UTC m=+5157.201792318" watchObservedRunningTime="2025-10-13 14:33:00.678134632 +0000 UTC m=+5158.211684888" Oct 13 14:33:00 crc kubenswrapper[4797]: I1013 14:33:00.678615 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.678609513 podStartE2EDuration="2.678609513s" podCreationTimestamp="2025-10-13 14:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:33:00.671824977 +0000 UTC m=+5158.205375253" watchObservedRunningTime="2025-10-13 14:33:00.678609513 +0000 UTC m=+5158.212159769" Oct 13 14:33:02 crc kubenswrapper[4797]: I1013 14:33:02.319300 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:33:02 crc kubenswrapper[4797]: I1013 14:33:02.320560 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:33:02 crc 
kubenswrapper[4797]: I1013 14:33:02.493383 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d9899cb4c-j4tsm" podUID="22a19aef-3dad-4ad9-a048-600878189bf6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.49:5353: i/o timeout" Oct 13 14:33:04 crc kubenswrapper[4797]: I1013 14:33:04.077917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 14:33:05 crc kubenswrapper[4797]: I1013 14:33:05.982864 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.552849 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wjhx2"] Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.554275 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.557518 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.557532 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.568207 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wjhx2"] Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.669172 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k64nz\" (UniqueName: \"kubernetes.io/projected/768cb00c-712a-4da6-bcc9-669b541c1761-kube-api-access-k64nz\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.669234 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-scripts\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.670361 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.670424 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-config-data\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.772274 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k64nz\" (UniqueName: \"kubernetes.io/projected/768cb00c-712a-4da6-bcc9-669b541c1761-kube-api-access-k64nz\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.772315 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-scripts\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.772387 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.772417 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-config-data\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.778338 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-scripts\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.778510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.785458 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-config-data\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.808259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k64nz\" 
(UniqueName: \"kubernetes.io/projected/768cb00c-712a-4da6-bcc9-669b541c1761-kube-api-access-k64nz\") pod \"nova-cell1-cell-mapping-wjhx2\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:06 crc kubenswrapper[4797]: I1013 14:33:06.875287 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:07 crc kubenswrapper[4797]: I1013 14:33:07.355539 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 14:33:07 crc kubenswrapper[4797]: I1013 14:33:07.355821 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 14:33:07 crc kubenswrapper[4797]: I1013 14:33:07.392229 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wjhx2"] Oct 13 14:33:07 crc kubenswrapper[4797]: I1013 14:33:07.723533 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wjhx2" event={"ID":"768cb00c-712a-4da6-bcc9-669b541c1761","Type":"ContainerStarted","Data":"db8d6ef859959b7c0c31e0cd5db58a792b490247eb1a5b1b415e4139878b9beb"} Oct 13 14:33:08 crc kubenswrapper[4797]: I1013 14:33:08.029848 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 14:33:08 crc kubenswrapper[4797]: I1013 14:33:08.029943 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 14:33:08 crc kubenswrapper[4797]: I1013 14:33:08.439078 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:08 crc kubenswrapper[4797]: I1013 
14:33:08.439367 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:08 crc kubenswrapper[4797]: I1013 14:33:08.736080 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wjhx2" event={"ID":"768cb00c-712a-4da6-bcc9-669b541c1761","Type":"ContainerStarted","Data":"e86e99da707a5e06b77ff5fd97dda0e3f7b432d2d862555c57df10db6517018a"} Oct 13 14:33:09 crc kubenswrapper[4797]: I1013 14:33:09.071245 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.69:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:09 crc kubenswrapper[4797]: I1013 14:33:09.078551 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 14:33:09 crc kubenswrapper[4797]: I1013 14:33:09.106779 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 14:33:09 crc kubenswrapper[4797]: I1013 14:33:09.112015 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.69:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:09 crc kubenswrapper[4797]: I1013 14:33:09.131662 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wjhx2" podStartSLOduration=3.131643225 podStartE2EDuration="3.131643225s" podCreationTimestamp="2025-10-13 14:33:06 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:33:08.753482984 +0000 UTC m=+5166.287033250" watchObservedRunningTime="2025-10-13 14:33:09.131643225 +0000 UTC m=+5166.665193501" Oct 13 14:33:09 crc kubenswrapper[4797]: I1013 14:33:09.784006 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 14:33:10 crc kubenswrapper[4797]: I1013 14:33:10.236951 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:33:10 crc kubenswrapper[4797]: E1013 14:33:10.237725 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:33:12 crc kubenswrapper[4797]: I1013 14:33:12.772906 4797 generic.go:334] "Generic (PLEG): container finished" podID="768cb00c-712a-4da6-bcc9-669b541c1761" containerID="e86e99da707a5e06b77ff5fd97dda0e3f7b432d2d862555c57df10db6517018a" exitCode=0 Oct 13 14:33:12 crc kubenswrapper[4797]: I1013 14:33:12.772977 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wjhx2" event={"ID":"768cb00c-712a-4da6-bcc9-669b541c1761","Type":"ContainerDied","Data":"e86e99da707a5e06b77ff5fd97dda0e3f7b432d2d862555c57df10db6517018a"} Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.171795 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.235442 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-config-data\") pod \"768cb00c-712a-4da6-bcc9-669b541c1761\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.235502 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-combined-ca-bundle\") pod \"768cb00c-712a-4da6-bcc9-669b541c1761\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.235550 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-scripts\") pod \"768cb00c-712a-4da6-bcc9-669b541c1761\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.235585 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k64nz\" (UniqueName: \"kubernetes.io/projected/768cb00c-712a-4da6-bcc9-669b541c1761-kube-api-access-k64nz\") pod \"768cb00c-712a-4da6-bcc9-669b541c1761\" (UID: \"768cb00c-712a-4da6-bcc9-669b541c1761\") " Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.241149 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-scripts" (OuterVolumeSpecName: "scripts") pod "768cb00c-712a-4da6-bcc9-669b541c1761" (UID: "768cb00c-712a-4da6-bcc9-669b541c1761"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.241186 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768cb00c-712a-4da6-bcc9-669b541c1761-kube-api-access-k64nz" (OuterVolumeSpecName: "kube-api-access-k64nz") pod "768cb00c-712a-4da6-bcc9-669b541c1761" (UID: "768cb00c-712a-4da6-bcc9-669b541c1761"). InnerVolumeSpecName "kube-api-access-k64nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.260967 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-config-data" (OuterVolumeSpecName: "config-data") pod "768cb00c-712a-4da6-bcc9-669b541c1761" (UID: "768cb00c-712a-4da6-bcc9-669b541c1761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.261657 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "768cb00c-712a-4da6-bcc9-669b541c1761" (UID: "768cb00c-712a-4da6-bcc9-669b541c1761"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.338229 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.338271 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.338283 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768cb00c-712a-4da6-bcc9-669b541c1761-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.338295 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k64nz\" (UniqueName: \"kubernetes.io/projected/768cb00c-712a-4da6-bcc9-669b541c1761-kube-api-access-k64nz\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.793993 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wjhx2" event={"ID":"768cb00c-712a-4da6-bcc9-669b541c1761","Type":"ContainerDied","Data":"db8d6ef859959b7c0c31e0cd5db58a792b490247eb1a5b1b415e4139878b9beb"} Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.794294 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db8d6ef859959b7c0c31e0cd5db58a792b490247eb1a5b1b415e4139878b9beb" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.794041 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wjhx2" Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.980651 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.980960 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-log" containerID="cri-o://bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8" gracePeriod=30 Oct 13 14:33:14 crc kubenswrapper[4797]: I1013 14:33:14.981182 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-api" containerID="cri-o://703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b" gracePeriod=30 Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.022200 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.022450 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e6606df7-682a-4a8f-a465-f241fc8ce5de" containerName="nova-scheduler-scheduler" containerID="cri-o://ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341" gracePeriod=30 Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.040790 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.041110 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-log" containerID="cri-o://1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad" gracePeriod=30 Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.041167 4797 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-metadata" containerID="cri-o://68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c" gracePeriod=30 Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.809016 4797 generic.go:334] "Generic (PLEG): container finished" podID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerID="bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8" exitCode=143 Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.809084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aff75a0-5bce-4173-a77e-81e9bf3711b9","Type":"ContainerDied","Data":"bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8"} Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.812942 4797 generic.go:334] "Generic (PLEG): container finished" podID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerID="1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad" exitCode=143 Oct 13 14:33:15 crc kubenswrapper[4797]: I1013 14:33:15.812966 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f9bf017-4407-42c7-aacd-719a6882b0c5","Type":"ContainerDied","Data":"1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad"} Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.460604 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.482643 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-config-data\") pod \"e6606df7-682a-4a8f-a465-f241fc8ce5de\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.483068 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qp2t\" (UniqueName: \"kubernetes.io/projected/e6606df7-682a-4a8f-a465-f241fc8ce5de-kube-api-access-9qp2t\") pod \"e6606df7-682a-4a8f-a465-f241fc8ce5de\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.483181 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-combined-ca-bundle\") pod \"e6606df7-682a-4a8f-a465-f241fc8ce5de\" (UID: \"e6606df7-682a-4a8f-a465-f241fc8ce5de\") " Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.488826 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6606df7-682a-4a8f-a465-f241fc8ce5de-kube-api-access-9qp2t" (OuterVolumeSpecName: "kube-api-access-9qp2t") pod "e6606df7-682a-4a8f-a465-f241fc8ce5de" (UID: "e6606df7-682a-4a8f-a465-f241fc8ce5de"). InnerVolumeSpecName "kube-api-access-9qp2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.511457 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6606df7-682a-4a8f-a465-f241fc8ce5de" (UID: "e6606df7-682a-4a8f-a465-f241fc8ce5de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.526485 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-config-data" (OuterVolumeSpecName: "config-data") pod "e6606df7-682a-4a8f-a465-f241fc8ce5de" (UID: "e6606df7-682a-4a8f-a465-f241fc8ce5de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.585168 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qp2t\" (UniqueName: \"kubernetes.io/projected/e6606df7-682a-4a8f-a465-f241fc8ce5de-kube-api-access-9qp2t\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.585200 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.585211 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6606df7-682a-4a8f-a465-f241fc8ce5de-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.823445 4797 generic.go:334] "Generic (PLEG): container finished" podID="e6606df7-682a-4a8f-a465-f241fc8ce5de" containerID="ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341" exitCode=0 Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.823504 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.823504 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6606df7-682a-4a8f-a465-f241fc8ce5de","Type":"ContainerDied","Data":"ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341"} Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.823624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6606df7-682a-4a8f-a465-f241fc8ce5de","Type":"ContainerDied","Data":"c8437af83d40b2ea3cd07920cc2c0da8c691e463e888bb643663aa0dc35e22d4"} Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.823648 4797 scope.go:117] "RemoveContainer" containerID="ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.854532 4797 scope.go:117] "RemoveContainer" containerID="ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341" Oct 13 14:33:16 crc kubenswrapper[4797]: E1013 14:33:16.858659 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341\": container with ID starting with ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341 not found: ID does not exist" containerID="ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.859378 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341"} err="failed to get container status \"ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341\": rpc error: code = NotFound desc = could not find container \"ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341\": container with ID starting with 
ba38339a2744d135151147d87803e1268a0b8045dbce051c131f1eef9d01b341 not found: ID does not exist" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.862588 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.884406 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.905419 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:33:16 crc kubenswrapper[4797]: E1013 14:33:16.905913 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768cb00c-712a-4da6-bcc9-669b541c1761" containerName="nova-manage" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.905934 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="768cb00c-712a-4da6-bcc9-669b541c1761" containerName="nova-manage" Oct 13 14:33:16 crc kubenswrapper[4797]: E1013 14:33:16.905966 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6606df7-682a-4a8f-a465-f241fc8ce5de" containerName="nova-scheduler-scheduler" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.905973 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6606df7-682a-4a8f-a465-f241fc8ce5de" containerName="nova-scheduler-scheduler" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.906139 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6606df7-682a-4a8f-a465-f241fc8ce5de" containerName="nova-scheduler-scheduler" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.906166 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="768cb00c-712a-4da6-bcc9-669b541c1761" containerName="nova-manage" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.906849 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.909105 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.914909 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.992166 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.992208 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-config-data\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:16 crc kubenswrapper[4797]: I1013 14:33:16.992371 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkz9\" (UniqueName: \"kubernetes.io/projected/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-kube-api-access-mrkz9\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.094426 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkz9\" (UniqueName: \"kubernetes.io/projected/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-kube-api-access-mrkz9\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.094548 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.094574 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-config-data\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.099419 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-config-data\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.100212 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.124637 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkz9\" (UniqueName: \"kubernetes.io/projected/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-kube-api-access-mrkz9\") pod \"nova-scheduler-0\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.232723 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.246044 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6606df7-682a-4a8f-a465-f241fc8ce5de" path="/var/lib/kubelet/pods/e6606df7-682a-4a8f-a465-f241fc8ce5de/volumes" Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.682756 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:33:17 crc kubenswrapper[4797]: I1013 14:33:17.834298 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab0e779d-76c7-48ef-9455-08b1c32ba0f8","Type":"ContainerStarted","Data":"d6608d765a69da28c75c82a90a5f8f72a0d0ee01d9bf52bbb4bc06979aa30b3d"} Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.463824 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.518513 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-config-data\") pod \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.518686 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle\") pod \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.518787 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc2xx\" (UniqueName: \"kubernetes.io/projected/8aff75a0-5bce-4173-a77e-81e9bf3711b9-kube-api-access-kc2xx\") pod \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\" (UID: 
\"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.519001 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aff75a0-5bce-4173-a77e-81e9bf3711b9-logs\") pod \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.519611 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aff75a0-5bce-4173-a77e-81e9bf3711b9-logs" (OuterVolumeSpecName: "logs") pod "8aff75a0-5bce-4173-a77e-81e9bf3711b9" (UID: "8aff75a0-5bce-4173-a77e-81e9bf3711b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.545068 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aff75a0-5bce-4173-a77e-81e9bf3711b9-kube-api-access-kc2xx" (OuterVolumeSpecName: "kube-api-access-kc2xx") pod "8aff75a0-5bce-4173-a77e-81e9bf3711b9" (UID: "8aff75a0-5bce-4173-a77e-81e9bf3711b9"). InnerVolumeSpecName "kube-api-access-kc2xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.545096 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle podName:8aff75a0-5bce-4173-a77e-81e9bf3711b9 nodeName:}" failed. No retries permitted until 2025-10-13 14:33:19.045071442 +0000 UTC m=+5176.578621698 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle") pod "8aff75a0-5bce-4173-a77e-81e9bf3711b9" (UID: "8aff75a0-5bce-4173-a77e-81e9bf3711b9") : error deleting /var/lib/kubelet/pods/8aff75a0-5bce-4173-a77e-81e9bf3711b9/volume-subpaths: remove /var/lib/kubelet/pods/8aff75a0-5bce-4173-a77e-81e9bf3711b9/volume-subpaths: no such file or directory Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.548017 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-config-data" (OuterVolumeSpecName: "config-data") pod "8aff75a0-5bce-4173-a77e-81e9bf3711b9" (UID: "8aff75a0-5bce-4173-a77e-81e9bf3711b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.620665 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc2xx\" (UniqueName: \"kubernetes.io/projected/8aff75a0-5bce-4173-a77e-81e9bf3711b9-kube-api-access-kc2xx\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.620691 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aff75a0-5bce-4173-a77e-81e9bf3711b9-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.620699 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.647491 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.721901 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f9bf017-4407-42c7-aacd-719a6882b0c5-logs\") pod \"6f9bf017-4407-42c7-aacd-719a6882b0c5\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.722144 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-combined-ca-bundle\") pod \"6f9bf017-4407-42c7-aacd-719a6882b0c5\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.722308 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-config-data\") pod \"6f9bf017-4407-42c7-aacd-719a6882b0c5\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.722450 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9bf017-4407-42c7-aacd-719a6882b0c5-logs" (OuterVolumeSpecName: "logs") pod "6f9bf017-4407-42c7-aacd-719a6882b0c5" (UID: "6f9bf017-4407-42c7-aacd-719a6882b0c5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.722715 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjnz5\" (UniqueName: \"kubernetes.io/projected/6f9bf017-4407-42c7-aacd-719a6882b0c5-kube-api-access-vjnz5\") pod \"6f9bf017-4407-42c7-aacd-719a6882b0c5\" (UID: \"6f9bf017-4407-42c7-aacd-719a6882b0c5\") " Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.723287 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f9bf017-4407-42c7-aacd-719a6882b0c5-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.725321 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9bf017-4407-42c7-aacd-719a6882b0c5-kube-api-access-vjnz5" (OuterVolumeSpecName: "kube-api-access-vjnz5") pod "6f9bf017-4407-42c7-aacd-719a6882b0c5" (UID: "6f9bf017-4407-42c7-aacd-719a6882b0c5"). InnerVolumeSpecName "kube-api-access-vjnz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.752981 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f9bf017-4407-42c7-aacd-719a6882b0c5" (UID: "6f9bf017-4407-42c7-aacd-719a6882b0c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.763764 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-config-data" (OuterVolumeSpecName: "config-data") pod "6f9bf017-4407-42c7-aacd-719a6882b0c5" (UID: "6f9bf017-4407-42c7-aacd-719a6882b0c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.824677 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.825041 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjnz5\" (UniqueName: \"kubernetes.io/projected/6f9bf017-4407-42c7-aacd-719a6882b0c5-kube-api-access-vjnz5\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.825056 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9bf017-4407-42c7-aacd-719a6882b0c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.848623 4797 generic.go:334] "Generic (PLEG): container finished" podID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerID="703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b" exitCode=0 Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.848731 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aff75a0-5bce-4173-a77e-81e9bf3711b9","Type":"ContainerDied","Data":"703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b"} Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.848771 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aff75a0-5bce-4173-a77e-81e9bf3711b9","Type":"ContainerDied","Data":"d7a11f2122cd0064c2cf807257ced937d72e964ce3449d760644032d6d52954f"} Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.848799 4797 scope.go:117] "RemoveContainer" containerID="703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.848897 4797 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.851098 4797 generic.go:334] "Generic (PLEG): container finished" podID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerID="68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c" exitCode=0 Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.851181 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f9bf017-4407-42c7-aacd-719a6882b0c5","Type":"ContainerDied","Data":"68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c"} Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.851221 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f9bf017-4407-42c7-aacd-719a6882b0c5","Type":"ContainerDied","Data":"1c21bf5387cb8f6395a8a02611325ad64a5083bb720560b0ff3a3ff702344fb8"} Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.851487 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.853017 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab0e779d-76c7-48ef-9455-08b1c32ba0f8","Type":"ContainerStarted","Data":"5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf"} Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.875126 4797 scope.go:117] "RemoveContainer" containerID="bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.878730 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.878712322 podStartE2EDuration="2.878712322s" podCreationTimestamp="2025-10-13 14:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:33:18.869568368 +0000 UTC m=+5176.403118634" watchObservedRunningTime="2025-10-13 14:33:18.878712322 +0000 UTC m=+5176.412262578" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.897283 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.908693 4797 scope.go:117] "RemoveContainer" containerID="703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.910272 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.913330 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b\": container with ID starting with 703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b not found: ID does not exist" 
containerID="703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.913390 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b"} err="failed to get container status \"703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b\": rpc error: code = NotFound desc = could not find container \"703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b\": container with ID starting with 703e7690eb61ddd2dd1574c8099c2addec1fc0c9acb69c0decc7424f49ee7d1b not found: ID does not exist" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.913427 4797 scope.go:117] "RemoveContainer" containerID="bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.916468 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8\": container with ID starting with bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8 not found: ID does not exist" containerID="bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.916521 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8"} err="failed to get container status \"bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8\": rpc error: code = NotFound desc = could not find container \"bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8\": container with ID starting with bc4d15f9d615ffd287630d709a42ee7bca1ed1b84e13c503b5f197f0646c0ab8 not found: ID does not exist" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.916547 4797 scope.go:117] 
"RemoveContainer" containerID="68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.933718 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.934472 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-log" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934527 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-log" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.934572 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-metadata" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934581 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-metadata" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.934596 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-api" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934604 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-api" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.934672 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-log" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934682 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-log" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934957 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-api" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934978 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-metadata" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.934997 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" containerName="nova-metadata-log" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.935040 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" containerName="nova-api-log" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.936490 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.940237 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.961244 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.964660 4797 scope.go:117] "RemoveContainer" containerID="1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.981334 4797 scope.go:117] "RemoveContainer" containerID="68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.981846 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c\": container with ID starting with 68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c not found: ID does not exist" 
containerID="68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.981889 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c"} err="failed to get container status \"68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c\": rpc error: code = NotFound desc = could not find container \"68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c\": container with ID starting with 68b8a9cea446d1d84a8aa87ebbacec65aaa5b2097eb1c8ff1130fbd1a223b18c not found: ID does not exist" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.981922 4797 scope.go:117] "RemoveContainer" containerID="1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad" Oct 13 14:33:18 crc kubenswrapper[4797]: E1013 14:33:18.982225 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad\": container with ID starting with 1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad not found: ID does not exist" containerID="1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad" Oct 13 14:33:18 crc kubenswrapper[4797]: I1013 14:33:18.982267 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad"} err="failed to get container status \"1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad\": rpc error: code = NotFound desc = could not find container \"1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad\": container with ID starting with 1b6eea7a93ba3bf6a333953ed1dda0ff84ef3b95ad05eebe93bcd617f5d06cad not found: ID does not exist" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.030103 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-config-data\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.030213 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.030241 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l4vd\" (UniqueName: \"kubernetes.io/projected/e872135e-d02b-4571-9217-0aa3cf0d444d-kube-api-access-6l4vd\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.030305 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e872135e-d02b-4571-9217-0aa3cf0d444d-logs\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.131694 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle\") pod \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\" (UID: \"8aff75a0-5bce-4173-a77e-81e9bf3711b9\") " Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.132046 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-config-data\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.132131 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.132159 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l4vd\" (UniqueName: \"kubernetes.io/projected/e872135e-d02b-4571-9217-0aa3cf0d444d-kube-api-access-6l4vd\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.132243 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e872135e-d02b-4571-9217-0aa3cf0d444d-logs\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.132646 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e872135e-d02b-4571-9217-0aa3cf0d444d-logs\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.136707 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aff75a0-5bce-4173-a77e-81e9bf3711b9" (UID: "8aff75a0-5bce-4173-a77e-81e9bf3711b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.137766 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.143175 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-config-data\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.151177 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l4vd\" (UniqueName: \"kubernetes.io/projected/e872135e-d02b-4571-9217-0aa3cf0d444d-kube-api-access-6l4vd\") pod \"nova-metadata-0\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.189627 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.221843 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.234852 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aff75a0-5bce-4173-a77e-81e9bf3711b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.258043 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9bf017-4407-42c7-aacd-719a6882b0c5" path="/var/lib/kubelet/pods/6f9bf017-4407-42c7-aacd-719a6882b0c5/volumes" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 
14:33:19.258838 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aff75a0-5bce-4173-a77e-81e9bf3711b9" path="/var/lib/kubelet/pods/8aff75a0-5bce-4173-a77e-81e9bf3711b9/volumes" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.259443 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.261332 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.261425 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.262049 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.264112 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.336382 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.336456 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-config-data\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.336575 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s87s\" (UniqueName: 
\"kubernetes.io/projected/dca2aaee-d1cd-4f97-b991-384473d89a49-kube-api-access-6s87s\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.337289 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca2aaee-d1cd-4f97-b991-384473d89a49-logs\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.438383 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca2aaee-d1cd-4f97-b991-384473d89a49-logs\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.438633 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.438699 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-config-data\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.438753 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s87s\" (UniqueName: \"kubernetes.io/projected/dca2aaee-d1cd-4f97-b991-384473d89a49-kube-api-access-6s87s\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 
14:33:19.439477 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca2aaee-d1cd-4f97-b991-384473d89a49-logs\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.443422 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.452762 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-config-data\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.463024 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s87s\" (UniqueName: \"kubernetes.io/projected/dca2aaee-d1cd-4f97-b991-384473d89a49-kube-api-access-6s87s\") pod \"nova-api-0\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.590107 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.717706 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:33:19 crc kubenswrapper[4797]: W1013 14:33:19.721031 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode872135e_d02b_4571_9217_0aa3cf0d444d.slice/crio-9e1f7ed5304816563ef954a6066ffd9e002a40e82d7806da07810644c97d90d6 WatchSource:0}: Error finding container 9e1f7ed5304816563ef954a6066ffd9e002a40e82d7806da07810644c97d90d6: Status 404 returned error can't find the container with id 9e1f7ed5304816563ef954a6066ffd9e002a40e82d7806da07810644c97d90d6 Oct 13 14:33:19 crc kubenswrapper[4797]: I1013 14:33:19.873373 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e872135e-d02b-4571-9217-0aa3cf0d444d","Type":"ContainerStarted","Data":"9e1f7ed5304816563ef954a6066ffd9e002a40e82d7806da07810644c97d90d6"} Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.018197 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:33:20 crc kubenswrapper[4797]: W1013 14:33:20.023652 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca2aaee_d1cd_4f97_b991_384473d89a49.slice/crio-612abea44fbda8019898d5da193f25fe1a1fe213599f5ccfeb01f54634b0f16c WatchSource:0}: Error finding container 612abea44fbda8019898d5da193f25fe1a1fe213599f5ccfeb01f54634b0f16c: Status 404 returned error can't find the container with id 612abea44fbda8019898d5da193f25fe1a1fe213599f5ccfeb01f54634b0f16c Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.885398 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dca2aaee-d1cd-4f97-b991-384473d89a49","Type":"ContainerStarted","Data":"5b9e18ce17adfde9e6ffb19d74e800da763fcaca03cdcc3c4a430c84e231a845"} Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.886344 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dca2aaee-d1cd-4f97-b991-384473d89a49","Type":"ContainerStarted","Data":"ac26cf3eec713da06a181855a0db367f1463291376e6ef025d8b4e08a2f19441"} Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.886413 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dca2aaee-d1cd-4f97-b991-384473d89a49","Type":"ContainerStarted","Data":"612abea44fbda8019898d5da193f25fe1a1fe213599f5ccfeb01f54634b0f16c"} Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.887685 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e872135e-d02b-4571-9217-0aa3cf0d444d","Type":"ContainerStarted","Data":"15627da574b8282a1bbfef2a9135d70d80740ec5600f34160ee053408940b6f8"} Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.887725 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e872135e-d02b-4571-9217-0aa3cf0d444d","Type":"ContainerStarted","Data":"f6b124f02b85e8a6700b935f0939bcd19d6b94a7adea514dea990f5f65e1e4f8"} Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.903988 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.903975365 podStartE2EDuration="1.903975365s" podCreationTimestamp="2025-10-13 14:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:33:20.901492614 +0000 UTC m=+5178.435042870" watchObservedRunningTime="2025-10-13 14:33:20.903975365 +0000 UTC m=+5178.437525621" Oct 13 14:33:20 crc kubenswrapper[4797]: I1013 14:33:20.931316 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.931294074 podStartE2EDuration="2.931294074s" podCreationTimestamp="2025-10-13 14:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:33:20.929323366 +0000 UTC m=+5178.462873632" watchObservedRunningTime="2025-10-13 14:33:20.931294074 +0000 UTC m=+5178.464844330" Oct 13 14:33:21 crc kubenswrapper[4797]: I1013 14:33:21.236877 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:33:21 crc kubenswrapper[4797]: E1013 14:33:21.237120 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:33:22 crc kubenswrapper[4797]: I1013 14:33:22.233249 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 14:33:24 crc kubenswrapper[4797]: I1013 14:33:24.262284 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:33:24 crc kubenswrapper[4797]: I1013 14:33:24.262637 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:33:27 crc kubenswrapper[4797]: I1013 14:33:27.233094 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 14:33:27 crc kubenswrapper[4797]: I1013 14:33:27.274114 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 
13 14:33:28 crc kubenswrapper[4797]: I1013 14:33:28.028669 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 14:33:29 crc kubenswrapper[4797]: I1013 14:33:29.263039 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 14:33:29 crc kubenswrapper[4797]: I1013 14:33:29.263087 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 14:33:29 crc kubenswrapper[4797]: I1013 14:33:29.594378 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 14:33:29 crc kubenswrapper[4797]: I1013 14:33:29.595075 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 14:33:30 crc kubenswrapper[4797]: I1013 14:33:30.345099 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:30 crc kubenswrapper[4797]: I1013 14:33:30.345146 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:30 crc kubenswrapper[4797]: I1013 14:33:30.673051 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:30 crc kubenswrapper[4797]: I1013 
14:33:30.673051 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:33:36 crc kubenswrapper[4797]: I1013 14:33:36.236031 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:33:36 crc kubenswrapper[4797]: E1013 14:33:36.236859 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.264967 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.265985 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.267866 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.269132 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.595621 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.595698 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 
14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.596287 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.596398 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.599316 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.599841 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.797719 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65d574b495-22jtd"] Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.799983 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.820960 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65d574b495-22jtd"] Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.938944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-config\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.939482 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-nb\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 
14:33:39.939566 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-dns-svc\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.939601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqwm\" (UniqueName: \"kubernetes.io/projected/972b0c4f-2817-4375-bd37-d6060aa6b80a-kube-api-access-mtqwm\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:39 crc kubenswrapper[4797]: I1013 14:33:39.939665 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-sb\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.041643 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-nb\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.041707 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-dns-svc\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 
14:33:40.041733 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqwm\" (UniqueName: \"kubernetes.io/projected/972b0c4f-2817-4375-bd37-d6060aa6b80a-kube-api-access-mtqwm\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.041764 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-sb\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.041848 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-config\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.042849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-sb\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.043064 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-dns-svc\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.043104 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-config\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.043149 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-nb\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.069478 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqwm\" (UniqueName: \"kubernetes.io/projected/972b0c4f-2817-4375-bd37-d6060aa6b80a-kube-api-access-mtqwm\") pod \"dnsmasq-dns-65d574b495-22jtd\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.128241 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:40 crc kubenswrapper[4797]: I1013 14:33:40.620631 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65d574b495-22jtd"] Oct 13 14:33:41 crc kubenswrapper[4797]: I1013 14:33:41.119708 4797 generic.go:334] "Generic (PLEG): container finished" podID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerID="4b930bebc4f4a98219c95a6595053bb573c1b33056d5d3a915963f925e77daf9" exitCode=0 Oct 13 14:33:41 crc kubenswrapper[4797]: I1013 14:33:41.119754 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d574b495-22jtd" event={"ID":"972b0c4f-2817-4375-bd37-d6060aa6b80a","Type":"ContainerDied","Data":"4b930bebc4f4a98219c95a6595053bb573c1b33056d5d3a915963f925e77daf9"} Oct 13 14:33:41 crc kubenswrapper[4797]: I1013 14:33:41.120037 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d574b495-22jtd" event={"ID":"972b0c4f-2817-4375-bd37-d6060aa6b80a","Type":"ContainerStarted","Data":"92399251a4c19e2deb29bbd93c6579881623cb1e81835dbf37e063743cb88ded"} Oct 13 14:33:42 crc kubenswrapper[4797]: I1013 14:33:42.134832 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d574b495-22jtd" event={"ID":"972b0c4f-2817-4375-bd37-d6060aa6b80a","Type":"ContainerStarted","Data":"959139e9736c3aa9d0d771a3f5b70befbd0d0e53979e5e8ef8cba8dccfdf52f2"} Oct 13 14:33:42 crc kubenswrapper[4797]: I1013 14:33:42.135103 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:42 crc kubenswrapper[4797]: I1013 14:33:42.172672 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65d574b495-22jtd" podStartSLOduration=3.172649447 podStartE2EDuration="3.172649447s" podCreationTimestamp="2025-10-13 14:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:33:42.155120387 +0000 UTC m=+5199.688670653" watchObservedRunningTime="2025-10-13 14:33:42.172649447 +0000 UTC m=+5199.706199703" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.130951 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.190215 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ddc8f5dc-dq7dh"] Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.190485 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerName="dnsmasq-dns" containerID="cri-o://733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b" gracePeriod=10 Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.235791 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:33:50 crc kubenswrapper[4797]: E1013 14:33:50.236040 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.696620 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.841839 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljg8p\" (UniqueName: \"kubernetes.io/projected/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-kube-api-access-ljg8p\") pod \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.841920 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-dns-svc\") pod \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.842020 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-sb\") pod \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.842069 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-config\") pod \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.842132 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-nb\") pod \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\" (UID: \"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96\") " Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.847902 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-kube-api-access-ljg8p" (OuterVolumeSpecName: "kube-api-access-ljg8p") pod "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" (UID: "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96"). InnerVolumeSpecName "kube-api-access-ljg8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.886156 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" (UID: "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.887968 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-config" (OuterVolumeSpecName: "config") pod "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" (UID: "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.894368 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" (UID: "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.894697 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" (UID: "0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.943787 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.943847 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.943863 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljg8p\" (UniqueName: \"kubernetes.io/projected/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-kube-api-access-ljg8p\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.943876 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:50 crc kubenswrapper[4797]: I1013 14:33:50.943886 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.217770 4797 generic.go:334] "Generic (PLEG): container finished" podID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerID="733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b" exitCode=0 Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.217862 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.217867 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" event={"ID":"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96","Type":"ContainerDied","Data":"733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b"} Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.218262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddc8f5dc-dq7dh" event={"ID":"0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96","Type":"ContainerDied","Data":"4a9d78b5402ff14cc370909061285680d329a48edcf6fccbe337d38b01cc66a4"} Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.218284 4797 scope.go:117] "RemoveContainer" containerID="733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.245789 4797 scope.go:117] "RemoveContainer" containerID="ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.271878 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ddc8f5dc-dq7dh"] Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.275408 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76ddc8f5dc-dq7dh"] Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.294864 4797 scope.go:117] "RemoveContainer" containerID="733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b" Oct 13 14:33:51 crc kubenswrapper[4797]: E1013 14:33:51.295425 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b\": container with ID starting with 733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b not found: ID does not exist" 
containerID="733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.295462 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b"} err="failed to get container status \"733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b\": rpc error: code = NotFound desc = could not find container \"733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b\": container with ID starting with 733a353a3af7f6a1a5b0bff35236bf33bc666221c30cec3bbd5d04c28424540b not found: ID does not exist" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.295492 4797 scope.go:117] "RemoveContainer" containerID="ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd" Oct 13 14:33:51 crc kubenswrapper[4797]: E1013 14:33:51.295753 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd\": container with ID starting with ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd not found: ID does not exist" containerID="ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd" Oct 13 14:33:51 crc kubenswrapper[4797]: I1013 14:33:51.295821 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd"} err="failed to get container status \"ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd\": rpc error: code = NotFound desc = could not find container \"ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd\": container with ID starting with ba97c6a7364bc4c96f42073b242438604e519aa487057cdf0a0cefd5a9944bfd not found: ID does not exist" Oct 13 14:33:52 crc kubenswrapper[4797]: I1013 14:33:52.991114 4797 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zpd46"] Oct 13 14:33:52 crc kubenswrapper[4797]: E1013 14:33:52.991969 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerName="init" Oct 13 14:33:52 crc kubenswrapper[4797]: I1013 14:33:52.991985 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerName="init" Oct 13 14:33:52 crc kubenswrapper[4797]: E1013 14:33:52.991998 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerName="dnsmasq-dns" Oct 13 14:33:52 crc kubenswrapper[4797]: I1013 14:33:52.992008 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerName="dnsmasq-dns" Oct 13 14:33:52 crc kubenswrapper[4797]: I1013 14:33:52.992230 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" containerName="dnsmasq-dns" Oct 13 14:33:52 crc kubenswrapper[4797]: I1013 14:33:52.992918 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.000385 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zpd46"] Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.082069 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7sdz\" (UniqueName: \"kubernetes.io/projected/b56f7d3a-7c03-403f-ac55-88898b8fc587-kube-api-access-q7sdz\") pod \"cinder-db-create-zpd46\" (UID: \"b56f7d3a-7c03-403f-ac55-88898b8fc587\") " pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.184306 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7sdz\" (UniqueName: \"kubernetes.io/projected/b56f7d3a-7c03-403f-ac55-88898b8fc587-kube-api-access-q7sdz\") pod \"cinder-db-create-zpd46\" (UID: \"b56f7d3a-7c03-403f-ac55-88898b8fc587\") " pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.204303 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7sdz\" (UniqueName: \"kubernetes.io/projected/b56f7d3a-7c03-403f-ac55-88898b8fc587-kube-api-access-q7sdz\") pod \"cinder-db-create-zpd46\" (UID: \"b56f7d3a-7c03-403f-ac55-88898b8fc587\") " pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.248708 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96" path="/var/lib/kubelet/pods/0b9a3713-8e2a-40ac-80a9-0fa0fa54ad96/volumes" Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.309291 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:53 crc kubenswrapper[4797]: I1013 14:33:53.741075 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zpd46"] Oct 13 14:33:54 crc kubenswrapper[4797]: I1013 14:33:54.245158 4797 generic.go:334] "Generic (PLEG): container finished" podID="b56f7d3a-7c03-403f-ac55-88898b8fc587" containerID="3bb66a10e270b3d66dd15ce7e287b20b9c0765c29b2947788cf93f37745a70a7" exitCode=0 Oct 13 14:33:54 crc kubenswrapper[4797]: I1013 14:33:54.245207 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zpd46" event={"ID":"b56f7d3a-7c03-403f-ac55-88898b8fc587","Type":"ContainerDied","Data":"3bb66a10e270b3d66dd15ce7e287b20b9c0765c29b2947788cf93f37745a70a7"} Oct 13 14:33:54 crc kubenswrapper[4797]: I1013 14:33:54.245409 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zpd46" event={"ID":"b56f7d3a-7c03-403f-ac55-88898b8fc587","Type":"ContainerStarted","Data":"370a1d739c025e15d75708d24f00b9d66d2e92f8b8425a962e5f77c7f0bdc1f3"} Oct 13 14:33:55 crc kubenswrapper[4797]: I1013 14:33:55.591346 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:55 crc kubenswrapper[4797]: I1013 14:33:55.747778 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7sdz\" (UniqueName: \"kubernetes.io/projected/b56f7d3a-7c03-403f-ac55-88898b8fc587-kube-api-access-q7sdz\") pod \"b56f7d3a-7c03-403f-ac55-88898b8fc587\" (UID: \"b56f7d3a-7c03-403f-ac55-88898b8fc587\") " Oct 13 14:33:55 crc kubenswrapper[4797]: I1013 14:33:55.757529 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56f7d3a-7c03-403f-ac55-88898b8fc587-kube-api-access-q7sdz" (OuterVolumeSpecName: "kube-api-access-q7sdz") pod "b56f7d3a-7c03-403f-ac55-88898b8fc587" (UID: "b56f7d3a-7c03-403f-ac55-88898b8fc587"). InnerVolumeSpecName "kube-api-access-q7sdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:33:55 crc kubenswrapper[4797]: I1013 14:33:55.849313 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7sdz\" (UniqueName: \"kubernetes.io/projected/b56f7d3a-7c03-403f-ac55-88898b8fc587-kube-api-access-q7sdz\") on node \"crc\" DevicePath \"\"" Oct 13 14:33:56 crc kubenswrapper[4797]: I1013 14:33:56.268267 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zpd46" event={"ID":"b56f7d3a-7c03-403f-ac55-88898b8fc587","Type":"ContainerDied","Data":"370a1d739c025e15d75708d24f00b9d66d2e92f8b8425a962e5f77c7f0bdc1f3"} Oct 13 14:33:56 crc kubenswrapper[4797]: I1013 14:33:56.268315 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="370a1d739c025e15d75708d24f00b9d66d2e92f8b8425a962e5f77c7f0bdc1f3" Oct 13 14:33:56 crc kubenswrapper[4797]: I1013 14:33:56.268324 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zpd46" Oct 13 14:33:56 crc kubenswrapper[4797]: E1013 14:33:56.370043 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb56f7d3a_7c03_403f_ac55_88898b8fc587.slice/crio-370a1d739c025e15d75708d24f00b9d66d2e92f8b8425a962e5f77c7f0bdc1f3\": RecentStats: unable to find data in memory cache]" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.111074 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4e02-account-create-r4vkp"] Oct 13 14:34:03 crc kubenswrapper[4797]: E1013 14:34:03.112525 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56f7d3a-7c03-403f-ac55-88898b8fc587" containerName="mariadb-database-create" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.112546 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56f7d3a-7c03-403f-ac55-88898b8fc587" containerName="mariadb-database-create" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.113067 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56f7d3a-7c03-403f-ac55-88898b8fc587" containerName="mariadb-database-create" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.114528 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.117744 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.122339 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4e02-account-create-r4vkp"] Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.243228 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:34:03 crc kubenswrapper[4797]: E1013 14:34:03.243574 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.292473 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48gg\" (UniqueName: \"kubernetes.io/projected/478609df-1ea6-49b7-b109-81174cbbb205-kube-api-access-x48gg\") pod \"cinder-4e02-account-create-r4vkp\" (UID: \"478609df-1ea6-49b7-b109-81174cbbb205\") " pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.394839 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48gg\" (UniqueName: \"kubernetes.io/projected/478609df-1ea6-49b7-b109-81174cbbb205-kube-api-access-x48gg\") pod \"cinder-4e02-account-create-r4vkp\" (UID: \"478609df-1ea6-49b7-b109-81174cbbb205\") " pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.415725 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48gg\" (UniqueName: \"kubernetes.io/projected/478609df-1ea6-49b7-b109-81174cbbb205-kube-api-access-x48gg\") pod \"cinder-4e02-account-create-r4vkp\" (UID: \"478609df-1ea6-49b7-b109-81174cbbb205\") " pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.454210 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:03 crc kubenswrapper[4797]: W1013 14:34:03.906145 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478609df_1ea6_49b7_b109_81174cbbb205.slice/crio-082053968e23fe84256c6a2b424951cb17b5f6641bf5ddc9ed26445dc55b8161 WatchSource:0}: Error finding container 082053968e23fe84256c6a2b424951cb17b5f6641bf5ddc9ed26445dc55b8161: Status 404 returned error can't find the container with id 082053968e23fe84256c6a2b424951cb17b5f6641bf5ddc9ed26445dc55b8161 Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.908421 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4e02-account-create-r4vkp"] Oct 13 14:34:03 crc kubenswrapper[4797]: I1013 14:34:03.915378 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 13 14:34:04 crc kubenswrapper[4797]: I1013 14:34:04.391556 4797 generic.go:334] "Generic (PLEG): container finished" podID="478609df-1ea6-49b7-b109-81174cbbb205" containerID="321df0c3ef0e5d2e9db3af1221c495f77bb5cd483167b07fb3a08ad6aa2b2e18" exitCode=0 Oct 13 14:34:04 crc kubenswrapper[4797]: I1013 14:34:04.391648 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e02-account-create-r4vkp" event={"ID":"478609df-1ea6-49b7-b109-81174cbbb205","Type":"ContainerDied","Data":"321df0c3ef0e5d2e9db3af1221c495f77bb5cd483167b07fb3a08ad6aa2b2e18"} Oct 13 14:34:04 crc 
kubenswrapper[4797]: I1013 14:34:04.391676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e02-account-create-r4vkp" event={"ID":"478609df-1ea6-49b7-b109-81174cbbb205","Type":"ContainerStarted","Data":"082053968e23fe84256c6a2b424951cb17b5f6641bf5ddc9ed26445dc55b8161"} Oct 13 14:34:05 crc kubenswrapper[4797]: I1013 14:34:05.725840 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:05 crc kubenswrapper[4797]: I1013 14:34:05.841789 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x48gg\" (UniqueName: \"kubernetes.io/projected/478609df-1ea6-49b7-b109-81174cbbb205-kube-api-access-x48gg\") pod \"478609df-1ea6-49b7-b109-81174cbbb205\" (UID: \"478609df-1ea6-49b7-b109-81174cbbb205\") " Oct 13 14:34:05 crc kubenswrapper[4797]: I1013 14:34:05.847257 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478609df-1ea6-49b7-b109-81174cbbb205-kube-api-access-x48gg" (OuterVolumeSpecName: "kube-api-access-x48gg") pod "478609df-1ea6-49b7-b109-81174cbbb205" (UID: "478609df-1ea6-49b7-b109-81174cbbb205"). InnerVolumeSpecName "kube-api-access-x48gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:05 crc kubenswrapper[4797]: I1013 14:34:05.944440 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48gg\" (UniqueName: \"kubernetes.io/projected/478609df-1ea6-49b7-b109-81174cbbb205-kube-api-access-x48gg\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:06 crc kubenswrapper[4797]: I1013 14:34:06.418172 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e02-account-create-r4vkp" event={"ID":"478609df-1ea6-49b7-b109-81174cbbb205","Type":"ContainerDied","Data":"082053968e23fe84256c6a2b424951cb17b5f6641bf5ddc9ed26445dc55b8161"} Oct 13 14:34:06 crc kubenswrapper[4797]: I1013 14:34:06.418233 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e02-account-create-r4vkp" Oct 13 14:34:06 crc kubenswrapper[4797]: I1013 14:34:06.418444 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="082053968e23fe84256c6a2b424951cb17b5f6641bf5ddc9ed26445dc55b8161" Oct 13 14:34:06 crc kubenswrapper[4797]: E1013 14:34:06.635068 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478609df_1ea6_49b7_b109_81174cbbb205.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.340271 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vrf8l"] Oct 13 14:34:08 crc kubenswrapper[4797]: E1013 14:34:08.340633 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478609df-1ea6-49b7-b109-81174cbbb205" containerName="mariadb-account-create" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.340648 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="478609df-1ea6-49b7-b109-81174cbbb205" containerName="mariadb-account-create" Oct 13 14:34:08 crc 
kubenswrapper[4797]: I1013 14:34:08.340900 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="478609df-1ea6-49b7-b109-81174cbbb205" containerName="mariadb-account-create" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.341481 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.347027 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.347173 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9cqvz" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.347270 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.351471 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vrf8l"] Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.495927 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx985\" (UniqueName: \"kubernetes.io/projected/3013fc1e-e99d-4fb5-96c5-2a384769b668-kube-api-access-mx985\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.496354 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-config-data\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.496396 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-combined-ca-bundle\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.496427 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-db-sync-config-data\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.496447 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-scripts\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.496519 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3013fc1e-e99d-4fb5-96c5-2a384769b668-etc-machine-id\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597504 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3013fc1e-e99d-4fb5-96c5-2a384769b668-etc-machine-id\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597580 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx985\" (UniqueName: 
\"kubernetes.io/projected/3013fc1e-e99d-4fb5-96c5-2a384769b668-kube-api-access-mx985\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597629 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3013fc1e-e99d-4fb5-96c5-2a384769b668-etc-machine-id\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597707 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-config-data\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597747 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-combined-ca-bundle\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597780 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-db-sync-config-data\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.597806 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-scripts\") pod \"cinder-db-sync-vrf8l\" (UID: 
\"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.603050 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-db-sync-config-data\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.603494 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-combined-ca-bundle\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.606910 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-config-data\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.609394 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-scripts\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.618633 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx985\" (UniqueName: \"kubernetes.io/projected/3013fc1e-e99d-4fb5-96c5-2a384769b668-kube-api-access-mx985\") pod \"cinder-db-sync-vrf8l\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:08 crc kubenswrapper[4797]: I1013 14:34:08.666800 4797 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:09 crc kubenswrapper[4797]: I1013 14:34:09.118611 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vrf8l"] Oct 13 14:34:09 crc kubenswrapper[4797]: I1013 14:34:09.454608 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrf8l" event={"ID":"3013fc1e-e99d-4fb5-96c5-2a384769b668","Type":"ContainerStarted","Data":"6529052068ff6ac486c720411bc959de1c9058b4d931601c39e0c2c221bdb342"} Oct 13 14:34:16 crc kubenswrapper[4797]: I1013 14:34:16.237369 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:34:16 crc kubenswrapper[4797]: E1013 14:34:16.238207 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:34:28 crc kubenswrapper[4797]: I1013 14:34:28.236174 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:34:28 crc kubenswrapper[4797]: E1013 14:34:28.237002 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:34:28 crc kubenswrapper[4797]: I1013 14:34:28.626778 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-db-sync-vrf8l" event={"ID":"3013fc1e-e99d-4fb5-96c5-2a384769b668","Type":"ContainerStarted","Data":"8d6f97744260ac699cc7e6337e820eb81c7f1e1243950affaf6429523db3681f"} Oct 13 14:34:28 crc kubenswrapper[4797]: I1013 14:34:28.649016 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vrf8l" podStartSLOduration=1.8230040669999998 podStartE2EDuration="20.648996472s" podCreationTimestamp="2025-10-13 14:34:08 +0000 UTC" firstStartedPulling="2025-10-13 14:34:09.119420959 +0000 UTC m=+5226.652971215" lastFinishedPulling="2025-10-13 14:34:27.945413364 +0000 UTC m=+5245.478963620" observedRunningTime="2025-10-13 14:34:28.646278726 +0000 UTC m=+5246.179828992" watchObservedRunningTime="2025-10-13 14:34:28.648996472 +0000 UTC m=+5246.182546728" Oct 13 14:34:31 crc kubenswrapper[4797]: I1013 14:34:31.677351 4797 generic.go:334] "Generic (PLEG): container finished" podID="3013fc1e-e99d-4fb5-96c5-2a384769b668" containerID="8d6f97744260ac699cc7e6337e820eb81c7f1e1243950affaf6429523db3681f" exitCode=0 Oct 13 14:34:31 crc kubenswrapper[4797]: I1013 14:34:31.677498 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrf8l" event={"ID":"3013fc1e-e99d-4fb5-96c5-2a384769b668","Type":"ContainerDied","Data":"8d6f97744260ac699cc7e6337e820eb81c7f1e1243950affaf6429523db3681f"} Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.067247 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.183800 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-scripts\") pod \"3013fc1e-e99d-4fb5-96c5-2a384769b668\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.183944 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx985\" (UniqueName: \"kubernetes.io/projected/3013fc1e-e99d-4fb5-96c5-2a384769b668-kube-api-access-mx985\") pod \"3013fc1e-e99d-4fb5-96c5-2a384769b668\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.184053 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3013fc1e-e99d-4fb5-96c5-2a384769b668-etc-machine-id\") pod \"3013fc1e-e99d-4fb5-96c5-2a384769b668\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.184118 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-db-sync-config-data\") pod \"3013fc1e-e99d-4fb5-96c5-2a384769b668\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.184185 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-config-data\") pod \"3013fc1e-e99d-4fb5-96c5-2a384769b668\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.184245 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-combined-ca-bundle\") pod \"3013fc1e-e99d-4fb5-96c5-2a384769b668\" (UID: \"3013fc1e-e99d-4fb5-96c5-2a384769b668\") " Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.184266 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3013fc1e-e99d-4fb5-96c5-2a384769b668-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3013fc1e-e99d-4fb5-96c5-2a384769b668" (UID: "3013fc1e-e99d-4fb5-96c5-2a384769b668"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.184597 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3013fc1e-e99d-4fb5-96c5-2a384769b668-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.189609 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-scripts" (OuterVolumeSpecName: "scripts") pod "3013fc1e-e99d-4fb5-96c5-2a384769b668" (UID: "3013fc1e-e99d-4fb5-96c5-2a384769b668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.189844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3013fc1e-e99d-4fb5-96c5-2a384769b668-kube-api-access-mx985" (OuterVolumeSpecName: "kube-api-access-mx985") pod "3013fc1e-e99d-4fb5-96c5-2a384769b668" (UID: "3013fc1e-e99d-4fb5-96c5-2a384769b668"). InnerVolumeSpecName "kube-api-access-mx985". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.191484 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3013fc1e-e99d-4fb5-96c5-2a384769b668" (UID: "3013fc1e-e99d-4fb5-96c5-2a384769b668"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.237159 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3013fc1e-e99d-4fb5-96c5-2a384769b668" (UID: "3013fc1e-e99d-4fb5-96c5-2a384769b668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.258557 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-config-data" (OuterVolumeSpecName: "config-data") pod "3013fc1e-e99d-4fb5-96c5-2a384769b668" (UID: "3013fc1e-e99d-4fb5-96c5-2a384769b668"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.285966 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx985\" (UniqueName: \"kubernetes.io/projected/3013fc1e-e99d-4fb5-96c5-2a384769b668-kube-api-access-mx985\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.286018 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.286034 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.286048 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.286059 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3013fc1e-e99d-4fb5-96c5-2a384769b668-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.700677 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrf8l" event={"ID":"3013fc1e-e99d-4fb5-96c5-2a384769b668","Type":"ContainerDied","Data":"6529052068ff6ac486c720411bc959de1c9058b4d931601c39e0c2c221bdb342"} Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.700722 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vrf8l" Oct 13 14:34:33 crc kubenswrapper[4797]: I1013 14:34:33.700726 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6529052068ff6ac486c720411bc959de1c9058b4d931601c39e0c2c221bdb342" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.086133 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-564c9b84c5-8nbpf"] Oct 13 14:34:34 crc kubenswrapper[4797]: E1013 14:34:34.087192 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3013fc1e-e99d-4fb5-96c5-2a384769b668" containerName="cinder-db-sync" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.087210 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3013fc1e-e99d-4fb5-96c5-2a384769b668" containerName="cinder-db-sync" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.087445 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3013fc1e-e99d-4fb5-96c5-2a384769b668" containerName="cinder-db-sync" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.088768 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.110533 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564c9b84c5-8nbpf"] Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.179081 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.180782 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.190236 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9cqvz" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.190355 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.190571 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.190749 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.191918 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.207297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-nb\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.207356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-dns-svc\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.207382 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27g8d\" (UniqueName: \"kubernetes.io/projected/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-kube-api-access-27g8d\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: 
\"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.207402 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-config\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.207443 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-sb\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.309321 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkq4q\" (UniqueName: \"kubernetes.io/projected/95071464-c69d-4edb-a00e-1f980c80301d-kube-api-access-dkq4q\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.309426 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-nb\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.309761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-dns-svc\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " 
pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.309912 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27g8d\" (UniqueName: \"kubernetes.io/projected/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-kube-api-access-27g8d\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.309972 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-config\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310069 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95071464-c69d-4edb-a00e-1f980c80301d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310141 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data-custom\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310226 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-scripts\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310333 
4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310367 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-sb\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95071464-c69d-4edb-a00e-1f980c80301d-logs\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.310859 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.311329 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-config\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.311354 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-dns-svc\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.311636 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-sb\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.311784 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-nb\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.333974 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27g8d\" (UniqueName: \"kubernetes.io/projected/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-kube-api-access-27g8d\") pod \"dnsmasq-dns-564c9b84c5-8nbpf\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.412691 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.412862 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95071464-c69d-4edb-a00e-1f980c80301d-logs\") pod \"cinder-api-0\" (UID: 
\"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.413018 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.413078 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkq4q\" (UniqueName: \"kubernetes.io/projected/95071464-c69d-4edb-a00e-1f980c80301d-kube-api-access-dkq4q\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.413287 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95071464-c69d-4edb-a00e-1f980c80301d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.413343 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data-custom\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.413713 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95071464-c69d-4edb-a00e-1f980c80301d-logs\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.414735 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/95071464-c69d-4edb-a00e-1f980c80301d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.414896 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-scripts\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.416908 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.417522 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data-custom\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.418054 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.419865 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-scripts\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.422886 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.440137 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkq4q\" (UniqueName: \"kubernetes.io/projected/95071464-c69d-4edb-a00e-1f980c80301d-kube-api-access-dkq4q\") pod \"cinder-api-0\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.516911 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 14:34:34 crc kubenswrapper[4797]: I1013 14:34:34.914563 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564c9b84c5-8nbpf"] Oct 13 14:34:35 crc kubenswrapper[4797]: W1013 14:34:35.087481 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95071464_c69d_4edb_a00e_1f980c80301d.slice/crio-9c8e7c03442d46fc0fd9fe380ca62d69af7cec86ab87a4ece5cc524b31229a6d WatchSource:0}: Error finding container 9c8e7c03442d46fc0fd9fe380ca62d69af7cec86ab87a4ece5cc524b31229a6d: Status 404 returned error can't find the container with id 9c8e7c03442d46fc0fd9fe380ca62d69af7cec86ab87a4ece5cc524b31229a6d Oct 13 14:34:35 crc kubenswrapper[4797]: I1013 14:34:35.091166 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:34:35 crc kubenswrapper[4797]: I1013 14:34:35.734645 4797 generic.go:334] "Generic (PLEG): container finished" podID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerID="6657962aea5639db472c8891f2f798191b6c24df7414d3b4f2809ca8c8b4d3d7" exitCode=0 Oct 13 14:34:35 crc kubenswrapper[4797]: I1013 14:34:35.734848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" 
event={"ID":"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4","Type":"ContainerDied","Data":"6657962aea5639db472c8891f2f798191b6c24df7414d3b4f2809ca8c8b4d3d7"} Oct 13 14:34:35 crc kubenswrapper[4797]: I1013 14:34:35.735087 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" event={"ID":"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4","Type":"ContainerStarted","Data":"5643c3982e32054ba5e7f400d81661af08d9e3c56fbd97019c4023f50d4edf77"} Oct 13 14:34:35 crc kubenswrapper[4797]: I1013 14:34:35.742856 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95071464-c69d-4edb-a00e-1f980c80301d","Type":"ContainerStarted","Data":"92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e"} Oct 13 14:34:35 crc kubenswrapper[4797]: I1013 14:34:35.742901 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95071464-c69d-4edb-a00e-1f980c80301d","Type":"ContainerStarted","Data":"9c8e7c03442d46fc0fd9fe380ca62d69af7cec86ab87a4ece5cc524b31229a6d"} Oct 13 14:34:36 crc kubenswrapper[4797]: I1013 14:34:36.756015 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" event={"ID":"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4","Type":"ContainerStarted","Data":"cc36507ded7605b7ba59eaca28d45815d99f6ef2bb7b0b08fd3fba0faf592300"} Oct 13 14:34:36 crc kubenswrapper[4797]: I1013 14:34:36.756425 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:36 crc kubenswrapper[4797]: I1013 14:34:36.758997 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95071464-c69d-4edb-a00e-1f980c80301d","Type":"ContainerStarted","Data":"6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16"} Oct 13 14:34:36 crc kubenswrapper[4797]: I1013 14:34:36.759080 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Oct 13 14:34:36 crc kubenswrapper[4797]: I1013 14:34:36.781475 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" podStartSLOduration=2.781455454 podStartE2EDuration="2.781455454s" podCreationTimestamp="2025-10-13 14:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:36.773700474 +0000 UTC m=+5254.307250730" watchObservedRunningTime="2025-10-13 14:34:36.781455454 +0000 UTC m=+5254.315005710" Oct 13 14:34:36 crc kubenswrapper[4797]: I1013 14:34:36.803998 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.803979315 podStartE2EDuration="2.803979315s" podCreationTimestamp="2025-10-13 14:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:36.796083172 +0000 UTC m=+5254.329633448" watchObservedRunningTime="2025-10-13 14:34:36.803979315 +0000 UTC m=+5254.337529571" Oct 13 14:34:43 crc kubenswrapper[4797]: I1013 14:34:43.249734 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:34:43 crc kubenswrapper[4797]: E1013 14:34:43.250501 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:34:44 crc kubenswrapper[4797]: I1013 14:34:44.424983 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:34:44 crc kubenswrapper[4797]: I1013 14:34:44.500978 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d574b495-22jtd"] Oct 13 14:34:44 crc kubenswrapper[4797]: I1013 14:34:44.501239 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65d574b495-22jtd" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerName="dnsmasq-dns" containerID="cri-o://959139e9736c3aa9d0d771a3f5b70befbd0d0e53979e5e8ef8cba8dccfdf52f2" gracePeriod=10 Oct 13 14:34:44 crc kubenswrapper[4797]: I1013 14:34:44.848047 4797 generic.go:334] "Generic (PLEG): container finished" podID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerID="959139e9736c3aa9d0d771a3f5b70befbd0d0e53979e5e8ef8cba8dccfdf52f2" exitCode=0 Oct 13 14:34:44 crc kubenswrapper[4797]: I1013 14:34:44.848376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d574b495-22jtd" event={"ID":"972b0c4f-2817-4375-bd37-d6060aa6b80a","Type":"ContainerDied","Data":"959139e9736c3aa9d0d771a3f5b70befbd0d0e53979e5e8ef8cba8dccfdf52f2"} Oct 13 14:34:44 crc kubenswrapper[4797]: I1013 14:34:44.990301 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.143717 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-config\") pod \"972b0c4f-2817-4375-bd37-d6060aa6b80a\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.143950 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-sb\") pod \"972b0c4f-2817-4375-bd37-d6060aa6b80a\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.144010 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-dns-svc\") pod \"972b0c4f-2817-4375-bd37-d6060aa6b80a\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.144056 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-nb\") pod \"972b0c4f-2817-4375-bd37-d6060aa6b80a\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.144151 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtqwm\" (UniqueName: \"kubernetes.io/projected/972b0c4f-2817-4375-bd37-d6060aa6b80a-kube-api-access-mtqwm\") pod \"972b0c4f-2817-4375-bd37-d6060aa6b80a\" (UID: \"972b0c4f-2817-4375-bd37-d6060aa6b80a\") " Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.150068 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/972b0c4f-2817-4375-bd37-d6060aa6b80a-kube-api-access-mtqwm" (OuterVolumeSpecName: "kube-api-access-mtqwm") pod "972b0c4f-2817-4375-bd37-d6060aa6b80a" (UID: "972b0c4f-2817-4375-bd37-d6060aa6b80a"). InnerVolumeSpecName "kube-api-access-mtqwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.191921 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-config" (OuterVolumeSpecName: "config") pod "972b0c4f-2817-4375-bd37-d6060aa6b80a" (UID: "972b0c4f-2817-4375-bd37-d6060aa6b80a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.193504 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "972b0c4f-2817-4375-bd37-d6060aa6b80a" (UID: "972b0c4f-2817-4375-bd37-d6060aa6b80a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.197362 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "972b0c4f-2817-4375-bd37-d6060aa6b80a" (UID: "972b0c4f-2817-4375-bd37-d6060aa6b80a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.205703 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "972b0c4f-2817-4375-bd37-d6060aa6b80a" (UID: "972b0c4f-2817-4375-bd37-d6060aa6b80a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.245729 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.245770 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.245784 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.245798 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtqwm\" (UniqueName: \"kubernetes.io/projected/972b0c4f-2817-4375-bd37-d6060aa6b80a-kube-api-access-mtqwm\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.245831 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972b0c4f-2817-4375-bd37-d6060aa6b80a-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.862912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d574b495-22jtd" event={"ID":"972b0c4f-2817-4375-bd37-d6060aa6b80a","Type":"ContainerDied","Data":"92399251a4c19e2deb29bbd93c6579881623cb1e81835dbf37e063743cb88ded"} Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.863228 4797 scope.go:117] "RemoveContainer" containerID="959139e9736c3aa9d0d771a3f5b70befbd0d0e53979e5e8ef8cba8dccfdf52f2" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.863165 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65d574b495-22jtd" Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.899425 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d574b495-22jtd"] Oct 13 14:34:45 crc kubenswrapper[4797]: I1013 14:34:45.908323 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65d574b495-22jtd"] Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.454161 4797 scope.go:117] "RemoveContainer" containerID="4b930bebc4f4a98219c95a6595053bb573c1b33056d5d3a915963f925e77daf9" Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.558930 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.636787 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.656959 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.657173 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-log" containerID="cri-o://ac26cf3eec713da06a181855a0db367f1463291376e6ef025d8b4e08a2f19441" gracePeriod=30 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.657568 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-api" containerID="cri-o://5b9e18ce17adfde9e6ffb19d74e800da763fcaca03cdcc3c4a430c84e231a845" gracePeriod=30 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.674640 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.674951 4797 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" containerName="nova-cell0-conductor-conductor" containerID="cri-o://97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c" gracePeriod=30 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.692726 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.693094 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-log" containerID="cri-o://15627da574b8282a1bbfef2a9135d70d80740ec5600f34160ee053408940b6f8" gracePeriod=30 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.693605 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-metadata" containerID="cri-o://f6b124f02b85e8a6700b935f0939bcd19d6b94a7adea514dea990f5f65e1e4f8" gracePeriod=30 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.709590 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.709790 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44" gracePeriod=30 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.729667 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/nova-cell1-novncproxy-0" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.62:6080/vnc_lite.html\": read tcp 
10.217.0.2:53150->10.217.1.62:6080: read: connection reset by peer" Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.729670 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.62:6080/vnc_lite.html\": read tcp 10.217.0.2:53160->10.217.1.62:6080: read: connection reset by peer" Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.890378 4797 generic.go:334] "Generic (PLEG): container finished" podID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerID="ac26cf3eec713da06a181855a0db367f1463291376e6ef025d8b4e08a2f19441" exitCode=143 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.890500 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dca2aaee-d1cd-4f97-b991-384473d89a49","Type":"ContainerDied","Data":"ac26cf3eec713da06a181855a0db367f1463291376e6ef025d8b4e08a2f19441"} Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.901961 4797 generic.go:334] "Generic (PLEG): container finished" podID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerID="15627da574b8282a1bbfef2a9135d70d80740ec5600f34160ee053408940b6f8" exitCode=143 Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.902083 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e872135e-d02b-4571-9217-0aa3cf0d444d","Type":"ContainerDied","Data":"15627da574b8282a1bbfef2a9135d70d80740ec5600f34160ee053408940b6f8"} Oct 13 14:34:46 crc kubenswrapper[4797]: I1013 14:34:46.909138 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerName="nova-scheduler-scheduler" containerID="cri-o://5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" gracePeriod=30 Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 
14:34:47.236724 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.239087 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.240781 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.240826 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerName="nova-scheduler-scheduler" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.260252 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" path="/var/lib/kubelet/pods/972b0c4f-2817-4375-bd37-d6060aa6b80a/volumes" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.516029 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.588567 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jwdh\" (UniqueName: \"kubernetes.io/projected/93ca9b07-0502-43f7-99f7-522b362c00e8-kube-api-access-9jwdh\") pod \"93ca9b07-0502-43f7-99f7-522b362c00e8\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.588656 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-combined-ca-bundle\") pod \"93ca9b07-0502-43f7-99f7-522b362c00e8\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.594005 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-config-data\") pod \"93ca9b07-0502-43f7-99f7-522b362c00e8\" (UID: \"93ca9b07-0502-43f7-99f7-522b362c00e8\") " Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.604677 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ca9b07-0502-43f7-99f7-522b362c00e8-kube-api-access-9jwdh" (OuterVolumeSpecName: "kube-api-access-9jwdh") pod "93ca9b07-0502-43f7-99f7-522b362c00e8" (UID: "93ca9b07-0502-43f7-99f7-522b362c00e8"). InnerVolumeSpecName "kube-api-access-9jwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.647264 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-config-data" (OuterVolumeSpecName: "config-data") pod "93ca9b07-0502-43f7-99f7-522b362c00e8" (UID: "93ca9b07-0502-43f7-99f7-522b362c00e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.648506 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93ca9b07-0502-43f7-99f7-522b362c00e8" (UID: "93ca9b07-0502-43f7-99f7-522b362c00e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.701485 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jwdh\" (UniqueName: \"kubernetes.io/projected/93ca9b07-0502-43f7-99f7-522b362c00e8-kube-api-access-9jwdh\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.701526 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.701543 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ca9b07-0502-43f7-99f7-522b362c00e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.917957 4797 generic.go:334] "Generic (PLEG): container finished" podID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerID="21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44" exitCode=0 Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.918004 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93ca9b07-0502-43f7-99f7-522b362c00e8","Type":"ContainerDied","Data":"21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44"} Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.918031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93ca9b07-0502-43f7-99f7-522b362c00e8","Type":"ContainerDied","Data":"200acffd475fece6bb7a1eec5ab388e99ad96b5625242c0b999e9375315b360d"} Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.918049 4797 scope.go:117] "RemoveContainer" containerID="21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.918074 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.947662 4797 scope.go:117] "RemoveContainer" containerID="21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44" Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.948171 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44\": container with ID starting with 21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44 not found: ID does not exist" containerID="21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.948208 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44"} err="failed to get container status \"21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44\": rpc error: code = NotFound desc = could not find container \"21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44\": container with ID starting with 21675c223001857578e43dbd45a4de7e11acf215c1307a87259b699dc9ae9b44 not found: ID does not exist" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.951049 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 
14:34:47.961343 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.981136 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.981634 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerName="dnsmasq-dns" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.981655 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerName="dnsmasq-dns" Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.981678 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.981686 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 14:34:47 crc kubenswrapper[4797]: E1013 14:34:47.981716 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerName="init" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.981726 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerName="init" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.981954 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="972b0c4f-2817-4375-bd37-d6060aa6b80a" containerName="dnsmasq-dns" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.981994 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.982751 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.984711 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 13 14:34:47 crc kubenswrapper[4797]: I1013 14:34:47.991437 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.108940 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62cg5\" (UniqueName: \"kubernetes.io/projected/a4d665a2-0bed-48b1-bb0c-0688c897a898-kube-api-access-62cg5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.109023 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d665a2-0bed-48b1-bb0c-0688c897a898-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.109137 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d665a2-0bed-48b1-bb0c-0688c897a898-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.210196 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62cg5\" (UniqueName: \"kubernetes.io/projected/a4d665a2-0bed-48b1-bb0c-0688c897a898-kube-api-access-62cg5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" 
Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.210264 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d665a2-0bed-48b1-bb0c-0688c897a898-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.210326 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d665a2-0bed-48b1-bb0c-0688c897a898-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.215503 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d665a2-0bed-48b1-bb0c-0688c897a898-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.222581 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d665a2-0bed-48b1-bb0c-0688c897a898-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.231399 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62cg5\" (UniqueName: \"kubernetes.io/projected/a4d665a2-0bed-48b1-bb0c-0688c897a898-kube-api-access-62cg5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4d665a2-0bed-48b1-bb0c-0688c897a898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.303572 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.738358 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 14:34:48 crc kubenswrapper[4797]: I1013 14:34:48.930086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4d665a2-0bed-48b1-bb0c-0688c897a898","Type":"ContainerStarted","Data":"df5fd213e6fb0c1bead425033939d871182c11612526ccd81d72772e7b4fc8d7"} Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.248925 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ca9b07-0502-43f7-99f7-522b362c00e8" path="/var/lib/kubelet/pods/93ca9b07-0502-43f7-99f7-522b362c00e8/volumes" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.466638 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.636381 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-combined-ca-bundle\") pod \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.636437 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7ddk\" (UniqueName: \"kubernetes.io/projected/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-kube-api-access-t7ddk\") pod \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\" (UID: \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.636480 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-config-data\") pod \"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\" (UID: 
\"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2\") " Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.642167 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-kube-api-access-t7ddk" (OuterVolumeSpecName: "kube-api-access-t7ddk") pod "2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" (UID: "2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2"). InnerVolumeSpecName "kube-api-access-t7ddk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.660575 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-config-data" (OuterVolumeSpecName: "config-data") pod "2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" (UID: "2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.665264 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" (UID: "2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.738446 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.738485 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7ddk\" (UniqueName: \"kubernetes.io/projected/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-kube-api-access-t7ddk\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.738497 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.802159 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": read tcp 10.217.0.2:35682->10.217.1.74:8774: read: connection reset by peer" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.802159 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": read tcp 10.217.0.2:35698->10.217.1.74:8774: read: connection reset by peer" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.847386 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": read tcp 10.217.0.2:39484->10.217.1.73:8775: read: connection reset by peer" Oct 13 14:34:49 crc 
kubenswrapper[4797]: I1013 14:34:49.847387 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": read tcp 10.217.0.2:39500->10.217.1.73:8775: read: connection reset by peer" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.854515 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.855071 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="912acab0-a01a-4f1a-9bcd-825354752818" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" gracePeriod=30 Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.941915 4797 generic.go:334] "Generic (PLEG): container finished" podID="2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" containerID="97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c" exitCode=0 Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.942081 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.942097 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2","Type":"ContainerDied","Data":"97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c"} Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.948203 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2","Type":"ContainerDied","Data":"7fc105c29c8edded54a1eae1b55c2a7e2d6173db088f9fab3641e2e00bec1851"} Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.948295 4797 scope.go:117] "RemoveContainer" containerID="97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c" Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.957356 4797 generic.go:334] "Generic (PLEG): container finished" podID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerID="5b9e18ce17adfde9e6ffb19d74e800da763fcaca03cdcc3c4a430c84e231a845" exitCode=0 Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.957486 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dca2aaee-d1cd-4f97-b991-384473d89a49","Type":"ContainerDied","Data":"5b9e18ce17adfde9e6ffb19d74e800da763fcaca03cdcc3c4a430c84e231a845"} Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.961721 4797 generic.go:334] "Generic (PLEG): container finished" podID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerID="f6b124f02b85e8a6700b935f0939bcd19d6b94a7adea514dea990f5f65e1e4f8" exitCode=0 Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.961791 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e872135e-d02b-4571-9217-0aa3cf0d444d","Type":"ContainerDied","Data":"f6b124f02b85e8a6700b935f0939bcd19d6b94a7adea514dea990f5f65e1e4f8"} Oct 13 14:34:49 crc 
kubenswrapper[4797]: I1013 14:34:49.968084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4d665a2-0bed-48b1-bb0c-0688c897a898","Type":"ContainerStarted","Data":"faeef177822614a1b14f5434df96eba326d9bd4c30fde02fde62820aff4bd368"} Oct 13 14:34:49 crc kubenswrapper[4797]: I1013 14:34:49.993868 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.993841057 podStartE2EDuration="2.993841057s" podCreationTimestamp="2025-10-13 14:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:49.98456605 +0000 UTC m=+5267.518116326" watchObservedRunningTime="2025-10-13 14:34:49.993841057 +0000 UTC m=+5267.527391313" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.066970 4797 scope.go:117] "RemoveContainer" containerID="97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.067213 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:34:50 crc kubenswrapper[4797]: E1013 14:34:50.073856 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c\": container with ID starting with 97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c not found: ID does not exist" containerID="97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.073912 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c"} err="failed to get container status \"97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c\": rpc 
error: code = NotFound desc = could not find container \"97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c\": container with ID starting with 97e5c68cf5e19917b631b61817624d7b8429ea7c865d7f40feaec88f0c18f64c not found: ID does not exist" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.083264 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.094932 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:34:50 crc kubenswrapper[4797]: E1013 14:34:50.095345 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" containerName="nova-cell0-conductor-conductor" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.095366 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" containerName="nova-cell0-conductor-conductor" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.095555 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" containerName="nova-cell0-conductor-conductor" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.100261 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.102239 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.117388 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.249978 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.250140 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t7tp\" (UniqueName: \"kubernetes.io/projected/65493b63-8563-4071-98e1-647040b90c23-kube-api-access-5t7tp\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.250182 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.277236 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.351525 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s87s\" (UniqueName: \"kubernetes.io/projected/dca2aaee-d1cd-4f97-b991-384473d89a49-kube-api-access-6s87s\") pod \"dca2aaee-d1cd-4f97-b991-384473d89a49\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.351645 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-config-data\") pod \"dca2aaee-d1cd-4f97-b991-384473d89a49\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.351947 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-combined-ca-bundle\") pod \"dca2aaee-d1cd-4f97-b991-384473d89a49\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.351994 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca2aaee-d1cd-4f97-b991-384473d89a49-logs\") pod \"dca2aaee-d1cd-4f97-b991-384473d89a49\" (UID: \"dca2aaee-d1cd-4f97-b991-384473d89a49\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.352286 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.352420 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5t7tp\" (UniqueName: \"kubernetes.io/projected/65493b63-8563-4071-98e1-647040b90c23-kube-api-access-5t7tp\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.352484 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.353593 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca2aaee-d1cd-4f97-b991-384473d89a49-logs" (OuterVolumeSpecName: "logs") pod "dca2aaee-d1cd-4f97-b991-384473d89a49" (UID: "dca2aaee-d1cd-4f97-b991-384473d89a49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.362189 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.362835 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca2aaee-d1cd-4f97-b991-384473d89a49-kube-api-access-6s87s" (OuterVolumeSpecName: "kube-api-access-6s87s") pod "dca2aaee-d1cd-4f97-b991-384473d89a49" (UID: "dca2aaee-d1cd-4f97-b991-384473d89a49"). InnerVolumeSpecName "kube-api-access-6s87s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.370027 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.371977 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t7tp\" (UniqueName: \"kubernetes.io/projected/65493b63-8563-4071-98e1-647040b90c23-kube-api-access-5t7tp\") pod \"nova-cell0-conductor-0\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.383300 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-config-data" (OuterVolumeSpecName: "config-data") pod "dca2aaee-d1cd-4f97-b991-384473d89a49" (UID: "dca2aaee-d1cd-4f97-b991-384473d89a49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.403942 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dca2aaee-d1cd-4f97-b991-384473d89a49" (UID: "dca2aaee-d1cd-4f97-b991-384473d89a49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.411650 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.439192 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.460988 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.461018 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca2aaee-d1cd-4f97-b991-384473d89a49-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.461028 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s87s\" (UniqueName: \"kubernetes.io/projected/dca2aaee-d1cd-4f97-b991-384473d89a49-kube-api-access-6s87s\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.461037 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca2aaee-d1cd-4f97-b991-384473d89a49-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.563023 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-combined-ca-bundle\") pod \"e872135e-d02b-4571-9217-0aa3cf0d444d\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.563374 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e872135e-d02b-4571-9217-0aa3cf0d444d-logs\") pod \"e872135e-d02b-4571-9217-0aa3cf0d444d\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.563470 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-config-data\") pod \"e872135e-d02b-4571-9217-0aa3cf0d444d\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.563562 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l4vd\" (UniqueName: \"kubernetes.io/projected/e872135e-d02b-4571-9217-0aa3cf0d444d-kube-api-access-6l4vd\") pod \"e872135e-d02b-4571-9217-0aa3cf0d444d\" (UID: \"e872135e-d02b-4571-9217-0aa3cf0d444d\") " Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.564065 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e872135e-d02b-4571-9217-0aa3cf0d444d-logs" (OuterVolumeSpecName: "logs") pod "e872135e-d02b-4571-9217-0aa3cf0d444d" (UID: "e872135e-d02b-4571-9217-0aa3cf0d444d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.564221 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e872135e-d02b-4571-9217-0aa3cf0d444d-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.572035 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e872135e-d02b-4571-9217-0aa3cf0d444d-kube-api-access-6l4vd" (OuterVolumeSpecName: "kube-api-access-6l4vd") pod "e872135e-d02b-4571-9217-0aa3cf0d444d" (UID: "e872135e-d02b-4571-9217-0aa3cf0d444d"). InnerVolumeSpecName "kube-api-access-6l4vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.624971 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e872135e-d02b-4571-9217-0aa3cf0d444d" (UID: "e872135e-d02b-4571-9217-0aa3cf0d444d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.629135 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-config-data" (OuterVolumeSpecName: "config-data") pod "e872135e-d02b-4571-9217-0aa3cf0d444d" (UID: "e872135e-d02b-4571-9217-0aa3cf0d444d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.676125 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.676185 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l4vd\" (UniqueName: \"kubernetes.io/projected/e872135e-d02b-4571-9217-0aa3cf0d444d-kube-api-access-6l4vd\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.676203 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e872135e-d02b-4571-9217-0aa3cf0d444d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.767749 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 14:34:50 crc kubenswrapper[4797]: E1013 14:34:50.954664 4797 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 14:34:50 crc kubenswrapper[4797]: E1013 14:34:50.956155 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 14:34:50 crc kubenswrapper[4797]: E1013 14:34:50.958428 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 14:34:50 crc kubenswrapper[4797]: E1013 14:34:50.958463 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="912acab0-a01a-4f1a-9bcd-825354752818" containerName="nova-cell1-conductor-conductor" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.976187 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65493b63-8563-4071-98e1-647040b90c23","Type":"ContainerStarted","Data":"d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b"} Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.976235 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"65493b63-8563-4071-98e1-647040b90c23","Type":"ContainerStarted","Data":"f43c9ed899e198c85092d2ecb0f8d5956c76a0cd3db3b5401721e8a67d2a5174"} Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.976321 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.978965 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dca2aaee-d1cd-4f97-b991-384473d89a49","Type":"ContainerDied","Data":"612abea44fbda8019898d5da193f25fe1a1fe213599f5ccfeb01f54634b0f16c"} Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.979003 4797 scope.go:117] "RemoveContainer" containerID="5b9e18ce17adfde9e6ffb19d74e800da763fcaca03cdcc3c4a430c84e231a845" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.979008 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.989107 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e872135e-d02b-4571-9217-0aa3cf0d444d","Type":"ContainerDied","Data":"9e1f7ed5304816563ef954a6066ffd9e002a40e82d7806da07810644c97d90d6"} Oct 13 14:34:50 crc kubenswrapper[4797]: I1013 14:34:50.989184 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.001312 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.001293797 podStartE2EDuration="1.001293797s" podCreationTimestamp="2025-10-13 14:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:50.99283343 +0000 UTC m=+5268.526383696" watchObservedRunningTime="2025-10-13 14:34:51.001293797 +0000 UTC m=+5268.534844063" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.023001 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.033047 4797 scope.go:117] "RemoveContainer" containerID="ac26cf3eec713da06a181855a0db367f1463291376e6ef025d8b4e08a2f19441" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.034142 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.059192 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.068340 4797 scope.go:117] "RemoveContainer" containerID="f6b124f02b85e8a6700b935f0939bcd19d6b94a7adea514dea990f5f65e1e4f8" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.077869 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.100366 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: E1013 14:34:51.100785 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-log" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.100814 4797 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-log" Oct 13 14:34:51 crc kubenswrapper[4797]: E1013 14:34:51.100828 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-api" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.100835 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-api" Oct 13 14:34:51 crc kubenswrapper[4797]: E1013 14:34:51.100846 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-metadata" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.100852 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-metadata" Oct 13 14:34:51 crc kubenswrapper[4797]: E1013 14:34:51.100866 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-log" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.100871 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-log" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.101062 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-api" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.101084 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-log" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.101095 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" containerName="nova-api-log" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.101102 4797 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" containerName="nova-metadata-metadata" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.102035 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.107205 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.118636 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.136488 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.137106 4797 scope.go:117] "RemoveContainer" containerID="15627da574b8282a1bbfef2a9135d70d80740ec5600f34160ee053408940b6f8" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.139534 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.141633 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.146911 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.192685 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.192847 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crdv4\" (UniqueName: \"kubernetes.io/projected/b3386188-e3d7-49b6-b099-7a5b66e012ee-kube-api-access-crdv4\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.192881 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3386188-e3d7-49b6-b099-7a5b66e012ee-logs\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.192925 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-config-data\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.251494 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2" path="/var/lib/kubelet/pods/2c326826-7bbd-44cb-aa2e-9c4e10aaf8f2/volumes" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.252217 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca2aaee-d1cd-4f97-b991-384473d89a49" path="/var/lib/kubelet/pods/dca2aaee-d1cd-4f97-b991-384473d89a49/volumes" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.253263 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e872135e-d02b-4571-9217-0aa3cf0d444d" path="/var/lib/kubelet/pods/e872135e-d02b-4571-9217-0aa3cf0d444d/volumes" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294464 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crdv4\" (UniqueName: \"kubernetes.io/projected/b3386188-e3d7-49b6-b099-7a5b66e012ee-kube-api-access-crdv4\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294514 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3386188-e3d7-49b6-b099-7a5b66e012ee-logs\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294560 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-config-data\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294617 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b527c4-151c-464b-a112-64bd7b5a7444-logs\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " 
pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294658 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4ds\" (UniqueName: \"kubernetes.io/projected/10b527c4-151c-464b-a112-64bd7b5a7444-kube-api-access-zg4ds\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294705 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.294722 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-config-data\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.295823 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3386188-e3d7-49b6-b099-7a5b66e012ee-logs\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.299838 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-config-data\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.302499 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.316119 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crdv4\" (UniqueName: \"kubernetes.io/projected/b3386188-e3d7-49b6-b099-7a5b66e012ee-kube-api-access-crdv4\") pod \"nova-api-0\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.396771 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b527c4-151c-464b-a112-64bd7b5a7444-logs\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.396853 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.396889 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4ds\" (UniqueName: \"kubernetes.io/projected/10b527c4-151c-464b-a112-64bd7b5a7444-kube-api-access-zg4ds\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " 
pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.396915 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-config-data\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.397911 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b527c4-151c-464b-a112-64bd7b5a7444-logs\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.400592 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.400603 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-config-data\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.414122 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4ds\" (UniqueName: \"kubernetes.io/projected/10b527c4-151c-464b-a112-64bd7b5a7444-kube-api-access-zg4ds\") pod \"nova-metadata-0\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.427172 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.462216 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.967468 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 14:34:51 crc kubenswrapper[4797]: W1013 14:34:51.976031 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3386188_e3d7_49b6_b099_7a5b66e012ee.slice/crio-305f114b16360690138d4618d0a0ffa4b05d0d3ed325872577336daef1b999c8 WatchSource:0}: Error finding container 305f114b16360690138d4618d0a0ffa4b05d0d3ed325872577336daef1b999c8: Status 404 returned error can't find the container with id 305f114b16360690138d4618d0a0ffa4b05d0d3ed325872577336daef1b999c8 Oct 13 14:34:51 crc kubenswrapper[4797]: I1013 14:34:51.999776 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3386188-e3d7-49b6-b099-7a5b66e012ee","Type":"ContainerStarted","Data":"305f114b16360690138d4618d0a0ffa4b05d0d3ed325872577336daef1b999c8"} Oct 13 14:34:52 crc kubenswrapper[4797]: I1013 14:34:52.056736 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 14:34:52 crc kubenswrapper[4797]: W1013 14:34:52.079948 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b527c4_151c_464b_a112_64bd7b5a7444.slice/crio-7eb9ba15b76879003e5d86a4e9a6fdfa1c25fed169eeac9e873ada71422433c1 WatchSource:0}: Error finding container 7eb9ba15b76879003e5d86a4e9a6fdfa1c25fed169eeac9e873ada71422433c1: Status 404 returned error can't find the container with id 7eb9ba15b76879003e5d86a4e9a6fdfa1c25fed169eeac9e873ada71422433c1 Oct 13 14:34:52 crc kubenswrapper[4797]: E1013 14:34:52.242222 4797 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 14:34:52 crc kubenswrapper[4797]: E1013 14:34:52.243651 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 14:34:52 crc kubenswrapper[4797]: E1013 14:34:52.245151 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 14:34:52 crc kubenswrapper[4797]: E1013 14:34:52.245193 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerName="nova-scheduler-scheduler" Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.014360 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3386188-e3d7-49b6-b099-7a5b66e012ee","Type":"ContainerStarted","Data":"c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24"} Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.014699 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b3386188-e3d7-49b6-b099-7a5b66e012ee","Type":"ContainerStarted","Data":"95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4"} Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.016412 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10b527c4-151c-464b-a112-64bd7b5a7444","Type":"ContainerStarted","Data":"6c060dc934ae135cc4334d93281f9963ff3fb142cdeb222dbb378c1dc96b7d1c"} Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.016460 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10b527c4-151c-464b-a112-64bd7b5a7444","Type":"ContainerStarted","Data":"fb66a988bd5f490396133e2807eb34da2f13ff1da09ea7d4b86bf67ed042750a"} Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.016472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10b527c4-151c-464b-a112-64bd7b5a7444","Type":"ContainerStarted","Data":"7eb9ba15b76879003e5d86a4e9a6fdfa1c25fed169eeac9e873ada71422433c1"} Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.036946 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.036927114 podStartE2EDuration="2.036927114s" podCreationTimestamp="2025-10-13 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:53.02816225 +0000 UTC m=+5270.561712526" watchObservedRunningTime="2025-10-13 14:34:53.036927114 +0000 UTC m=+5270.570477380" Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.057113 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.057094588 podStartE2EDuration="2.057094588s" podCreationTimestamp="2025-10-13 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-13 14:34:53.052022114 +0000 UTC m=+5270.585572410" watchObservedRunningTime="2025-10-13 14:34:53.057094588 +0000 UTC m=+5270.590644844" Oct 13 14:34:53 crc kubenswrapper[4797]: I1013 14:34:53.305163 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.236738 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:34:55 crc kubenswrapper[4797]: E1013 14:34:55.237243 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.478205 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.624967 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.769985 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-combined-ca-bundle\") pod \"912acab0-a01a-4f1a-9bcd-825354752818\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.770358 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-config-data\") pod \"912acab0-a01a-4f1a-9bcd-825354752818\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.770412 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fzfk\" (UniqueName: \"kubernetes.io/projected/912acab0-a01a-4f1a-9bcd-825354752818-kube-api-access-5fzfk\") pod \"912acab0-a01a-4f1a-9bcd-825354752818\" (UID: \"912acab0-a01a-4f1a-9bcd-825354752818\") " Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.802492 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/912acab0-a01a-4f1a-9bcd-825354752818-kube-api-access-5fzfk" (OuterVolumeSpecName: "kube-api-access-5fzfk") pod "912acab0-a01a-4f1a-9bcd-825354752818" (UID: "912acab0-a01a-4f1a-9bcd-825354752818"). InnerVolumeSpecName "kube-api-access-5fzfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.810191 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "912acab0-a01a-4f1a-9bcd-825354752818" (UID: "912acab0-a01a-4f1a-9bcd-825354752818"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.810671 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-config-data" (OuterVolumeSpecName: "config-data") pod "912acab0-a01a-4f1a-9bcd-825354752818" (UID: "912acab0-a01a-4f1a-9bcd-825354752818"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.872054 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.872083 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912acab0-a01a-4f1a-9bcd-825354752818-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:55 crc kubenswrapper[4797]: I1013 14:34:55.872092 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fzfk\" (UniqueName: \"kubernetes.io/projected/912acab0-a01a-4f1a-9bcd-825354752818-kube-api-access-5fzfk\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.045342 4797 generic.go:334] "Generic (PLEG): container finished" podID="912acab0-a01a-4f1a-9bcd-825354752818" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" exitCode=0 Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.045424 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"912acab0-a01a-4f1a-9bcd-825354752818","Type":"ContainerDied","Data":"7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e"} Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.045438 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.045457 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"912acab0-a01a-4f1a-9bcd-825354752818","Type":"ContainerDied","Data":"5906046ddd0238ec717751a5b3a97f8a999aef8e4eb3a075f4e7996c897df87a"} Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.045479 4797 scope.go:117] "RemoveContainer" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.049835 4797 generic.go:334] "Generic (PLEG): container finished" podID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" exitCode=0 Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.049883 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab0e779d-76c7-48ef-9455-08b1c32ba0f8","Type":"ContainerDied","Data":"5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf"} Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.076667 4797 scope.go:117] "RemoveContainer" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" Oct 13 14:34:56 crc kubenswrapper[4797]: E1013 14:34:56.077199 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e\": container with ID starting with 7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e not found: ID does not exist" containerID="7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.077231 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e"} err="failed to get 
container status \"7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e\": rpc error: code = NotFound desc = could not find container \"7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e\": container with ID starting with 7d039f6c8b418f3f8dfe577577e7ef7a82a4dda03e9f3001acab7bb0bc5afb5e not found: ID does not exist" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.094420 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.108901 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.119054 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:34:56 crc kubenswrapper[4797]: E1013 14:34:56.119535 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912acab0-a01a-4f1a-9bcd-825354752818" containerName="nova-cell1-conductor-conductor" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.119555 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="912acab0-a01a-4f1a-9bcd-825354752818" containerName="nova-cell1-conductor-conductor" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.119725 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="912acab0-a01a-4f1a-9bcd-825354752818" containerName="nova-cell1-conductor-conductor" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.120382 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.122837 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.129060 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.281088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxm5f\" (UniqueName: \"kubernetes.io/projected/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-kube-api-access-vxm5f\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.281161 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.281462 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.383187 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc 
kubenswrapper[4797]: I1013 14:34:56.383354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxm5f\" (UniqueName: \"kubernetes.io/projected/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-kube-api-access-vxm5f\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.383412 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.397558 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.401067 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.403408 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxm5f\" (UniqueName: \"kubernetes.io/projected/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-kube-api-access-vxm5f\") pod \"nova-cell1-conductor-0\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.447011 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.463955 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.464006 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.542620 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.688158 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-combined-ca-bundle\") pod \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.688286 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrkz9\" (UniqueName: \"kubernetes.io/projected/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-kube-api-access-mrkz9\") pod \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.688480 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-config-data\") pod \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\" (UID: \"ab0e779d-76c7-48ef-9455-08b1c32ba0f8\") " Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.696248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-kube-api-access-mrkz9" (OuterVolumeSpecName: "kube-api-access-mrkz9") pod "ab0e779d-76c7-48ef-9455-08b1c32ba0f8" (UID: 
"ab0e779d-76c7-48ef-9455-08b1c32ba0f8"). InnerVolumeSpecName "kube-api-access-mrkz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.715637 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-config-data" (OuterVolumeSpecName: "config-data") pod "ab0e779d-76c7-48ef-9455-08b1c32ba0f8" (UID: "ab0e779d-76c7-48ef-9455-08b1c32ba0f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.722982 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0e779d-76c7-48ef-9455-08b1c32ba0f8" (UID: "ab0e779d-76c7-48ef-9455-08b1c32ba0f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.790944 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrkz9\" (UniqueName: \"kubernetes.io/projected/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-kube-api-access-mrkz9\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.790994 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.791006 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e779d-76c7-48ef-9455-08b1c32ba0f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:34:56 crc kubenswrapper[4797]: I1013 14:34:56.894317 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 
14:34:56 crc kubenswrapper[4797]: W1013 14:34:56.894923 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cc2cb14_0b77_4361_b3e4_c1196a2253ce.slice/crio-18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7 WatchSource:0}: Error finding container 18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7: Status 404 returned error can't find the container with id 18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7 Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.058145 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-x8km4"] Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.068085 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-x8km4"] Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.082264 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab0e779d-76c7-48ef-9455-08b1c32ba0f8","Type":"ContainerDied","Data":"d6608d765a69da28c75c82a90a5f8f72a0d0ee01d9bf52bbb4bc06979aa30b3d"} Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.082296 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.082322 4797 scope.go:117] "RemoveContainer" containerID="5161b110bfe6553ed3a8f68efc55fbb21768983c7aa7ec2f96cca17ae383b3bf" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.113582 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7cc2cb14-0b77-4361-b3e4-c1196a2253ce","Type":"ContainerStarted","Data":"18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7"} Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.214211 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.300295 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="912acab0-a01a-4f1a-9bcd-825354752818" path="/var/lib/kubelet/pods/912acab0-a01a-4f1a-9bcd-825354752818/volumes" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.300878 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd035957-69f9-40dd-9c0d-ae7846605b91" path="/var/lib/kubelet/pods/bd035957-69f9-40dd-9c0d-ae7846605b91/volumes" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.301464 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.301494 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:34:57 crc kubenswrapper[4797]: E1013 14:34:57.301784 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerName="nova-scheduler-scheduler" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.301796 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerName="nova-scheduler-scheduler" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.308855 4797 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" containerName="nova-scheduler-scheduler" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.309682 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.309765 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.313487 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.440748 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.440894 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z82jl\" (UniqueName: \"kubernetes.io/projected/abf87031-2d48-4271-9cbd-2c872a8e75e3-kube-api-access-z82jl\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.440998 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-config-data\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.542827 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z82jl\" (UniqueName: 
\"kubernetes.io/projected/abf87031-2d48-4271-9cbd-2c872a8e75e3-kube-api-access-z82jl\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.542939 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-config-data\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.543021 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.549837 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.549966 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-config-data\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.563418 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z82jl\" (UniqueName: \"kubernetes.io/projected/abf87031-2d48-4271-9cbd-2c872a8e75e3-kube-api-access-z82jl\") pod \"nova-scheduler-0\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " 
pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.645460 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 14:34:57 crc kubenswrapper[4797]: I1013 14:34:57.943747 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.138129 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7cc2cb14-0b77-4361-b3e4-c1196a2253ce","Type":"ContainerStarted","Data":"af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851"} Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.138597 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.140055 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf87031-2d48-4271-9cbd-2c872a8e75e3","Type":"ContainerStarted","Data":"1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35"} Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.140100 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf87031-2d48-4271-9cbd-2c872a8e75e3","Type":"ContainerStarted","Data":"ec6e85227c57b9640496984943f3c136f93f62a974b37c3fccedd06ec7b475c6"} Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.160477 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.160458985 podStartE2EDuration="2.160458985s" podCreationTimestamp="2025-10-13 14:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:58.153948595 +0000 UTC m=+5275.687498851" watchObservedRunningTime="2025-10-13 14:34:58.160458985 +0000 UTC 
m=+5275.694009241" Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.174060 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.174040937 podStartE2EDuration="1.174040937s" podCreationTimestamp="2025-10-13 14:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:34:58.172205362 +0000 UTC m=+5275.705755618" watchObservedRunningTime="2025-10-13 14:34:58.174040937 +0000 UTC m=+5275.707591193" Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.304757 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:58 crc kubenswrapper[4797]: I1013 14:34:58.323580 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:59 crc kubenswrapper[4797]: I1013 14:34:59.161219 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 13 14:34:59 crc kubenswrapper[4797]: I1013 14:34:59.248179 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0e779d-76c7-48ef-9455-08b1c32ba0f8" path="/var/lib/kubelet/pods/ab0e779d-76c7-48ef-9455-08b1c32ba0f8/volumes" Oct 13 14:35:01 crc kubenswrapper[4797]: I1013 14:35:01.428229 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 14:35:01 crc kubenswrapper[4797]: I1013 14:35:01.428575 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 14:35:01 crc kubenswrapper[4797]: I1013 14:35:01.462503 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 14:35:01 crc kubenswrapper[4797]: I1013 14:35:01.462554 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 14:35:02 crc kubenswrapper[4797]: I1013 14:35:02.595008 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:35:02 crc kubenswrapper[4797]: I1013 14:35:02.595307 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:35:02 crc kubenswrapper[4797]: I1013 14:35:02.595321 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:35:02 crc kubenswrapper[4797]: I1013 14:35:02.595337 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 14:35:02 crc kubenswrapper[4797]: I1013 14:35:02.646065 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.035934 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.039203 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.041925 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.048437 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.182101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.182371 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sftwk\" (UniqueName: \"kubernetes.io/projected/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-kube-api-access-sftwk\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.182555 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.182708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-scripts\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.182764 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.182790 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284079 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sftwk\" (UniqueName: \"kubernetes.io/projected/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-kube-api-access-sftwk\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284177 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284257 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-scripts\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284319 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284339 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.284333 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.295039 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.304377 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " 
pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.304912 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.306372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-scripts\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.317330 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sftwk\" (UniqueName: \"kubernetes.io/projected/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-kube-api-access-sftwk\") pod \"cinder-scheduler-0\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.359913 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 14:35:05 crc kubenswrapper[4797]: I1013 14:35:05.882073 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:05 crc kubenswrapper[4797]: W1013 14:35:05.888852 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbf7ae4b_2087_46c3_b7ee_0fe3223215c2.slice/crio-d66849e0667782afc7ddf7496f38c85bb1addc93f1d0ad66531b3aba9c6d3b0b WatchSource:0}: Error finding container d66849e0667782afc7ddf7496f38c85bb1addc93f1d0ad66531b3aba9c6d3b0b: Status 404 returned error can't find the container with id d66849e0667782afc7ddf7496f38c85bb1addc93f1d0ad66531b3aba9c6d3b0b Oct 13 14:35:06 crc kubenswrapper[4797]: I1013 14:35:06.213047 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2","Type":"ContainerStarted","Data":"d66849e0667782afc7ddf7496f38c85bb1addc93f1d0ad66531b3aba9c6d3b0b"} Oct 13 14:35:06 crc kubenswrapper[4797]: I1013 14:35:06.502906 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 13 14:35:06 crc kubenswrapper[4797]: I1013 14:35:06.632415 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:35:06 crc kubenswrapper[4797]: I1013 14:35:06.632856 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api-log" containerID="cri-o://92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e" gracePeriod=30 Oct 13 14:35:06 crc kubenswrapper[4797]: I1013 14:35:06.633723 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api" 
containerID="cri-o://6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16" gracePeriod=30 Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.032680 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3079-account-create-xw5xn"] Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.054225 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3079-account-create-xw5xn"] Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.081820 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.083510 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.087540 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.091189 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.216719 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.216753 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.216776 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.216817 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.216942 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217002 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217026 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-dev\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217077 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217113 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217222 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-sys\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217350 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217392 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc355640-e2b0-4d27-8135-0ab3599cba98-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217413 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217530 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-run\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217561 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.217585 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6tlc\" (UniqueName: \"kubernetes.io/projected/dc355640-e2b0-4d27-8135-0ab3599cba98-kube-api-access-b6tlc\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.224783 4797 generic.go:334] "Generic (PLEG): container finished" podID="95071464-c69d-4edb-a00e-1f980c80301d" containerID="92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e" exitCode=143 Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.224875 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95071464-c69d-4edb-a00e-1f980c80301d","Type":"ContainerDied","Data":"92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e"} Oct 13 14:35:07 
crc kubenswrapper[4797]: I1013 14:35:07.228099 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2","Type":"ContainerStarted","Data":"e0cb55008831959f12e5d7e58804c2cbab8e388757dd30e23bd0047d59c1b5d8"} Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.241476 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:35:07 crc kubenswrapper[4797]: E1013 14:35:07.241705 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.249410 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6953ed8a-6074-4c10-9810-66b62741e903" path="/var/lib/kubelet/pods/6953ed8a-6074-4c10-9810-66b62741e903/volumes" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.319891 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6tlc\" (UniqueName: \"kubernetes.io/projected/dc355640-e2b0-4d27-8135-0ab3599cba98-kube-api-access-b6tlc\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320011 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 
crc kubenswrapper[4797]: I1013 14:35:07.320038 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320058 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320103 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320141 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320137 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320166 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-dev\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320185 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320221 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320252 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320348 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-sys\") pod \"cinder-volume-volume1-0\" (UID: 
\"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320394 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320443 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-dev\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320483 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320731 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320303 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-sys\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320932 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320966 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc355640-e2b0-4d27-8135-0ab3599cba98-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.320987 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.321099 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.321130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-run\") pod 
\"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.321111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc355640-e2b0-4d27-8135-0ab3599cba98-run\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.321211 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.324049 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc355640-e2b0-4d27-8135-0ab3599cba98-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.324674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.327654 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.330378 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.345019 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6tlc\" (UniqueName: \"kubernetes.io/projected/dc355640-e2b0-4d27-8135-0ab3599cba98-kube-api-access-b6tlc\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.345778 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc355640-e2b0-4d27-8135-0ab3599cba98-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"dc355640-e2b0-4d27-8135-0ab3599cba98\") " pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.409503 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.649077 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.735659 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.737540 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.752193 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.760033 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840231 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-run\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840304 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840344 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840398 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840435 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-sys\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840490 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840513 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840550 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-config-data\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840579 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-lib-modules\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840681 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840704 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnhrc\" (UniqueName: \"kubernetes.io/projected/2579cd51-2c4b-4a29-993e-d38dfffead2b-kube-api-access-jnhrc\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840747 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-scripts\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840839 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-dev\") pod \"cinder-backup-0\" (UID: 
\"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.840876 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2579cd51-2c4b-4a29-993e-d38dfffead2b-ceph\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.910645 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.948972 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949011 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949034 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-config-data\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949052 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-lib-modules\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") 
" pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949082 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949135 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnhrc\" (UniqueName: \"kubernetes.io/projected/2579cd51-2c4b-4a29-993e-d38dfffead2b-kube-api-access-jnhrc\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949160 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-scripts\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949182 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949201 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-dev\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949222 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2579cd51-2c4b-4a29-993e-d38dfffead2b-ceph\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949255 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-run\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949270 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949285 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949316 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc 
kubenswrapper[4797]: I1013 14:35:07.949334 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-sys\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949414 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-sys\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949462 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.949490 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.950983 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.951075 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-lib-modules\") pod \"cinder-backup-0\" (UID: 
\"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.951110 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.951141 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.954290 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-run\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.954374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-dev\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.958226 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2579cd51-2c4b-4a29-993e-d38dfffead2b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.960520 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-config-data\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.960843 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.969411 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.970344 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2579cd51-2c4b-4a29-993e-d38dfffead2b-ceph\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:07 crc kubenswrapper[4797]: I1013 14:35:07.997506 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnhrc\" (UniqueName: \"kubernetes.io/projected/2579cd51-2c4b-4a29-993e-d38dfffead2b-kube-api-access-jnhrc\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.000255 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2579cd51-2c4b-4a29-993e-d38dfffead2b-scripts\") pod \"cinder-backup-0\" (UID: \"2579cd51-2c4b-4a29-993e-d38dfffead2b\") " pod="openstack/cinder-backup-0" Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 
14:35:08.087776 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.131153 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 13 14:35:08 crc kubenswrapper[4797]: W1013 14:35:08.134261 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc355640_e2b0_4d27_8135_0ab3599cba98.slice/crio-f7ccd24385c80d3745b3f284c07737b8d548ee34b25e6bc4faa5ab727a7b5d0a WatchSource:0}: Error finding container f7ccd24385c80d3745b3f284c07737b8d548ee34b25e6bc4faa5ab727a7b5d0a: Status 404 returned error can't find the container with id f7ccd24385c80d3745b3f284c07737b8d548ee34b25e6bc4faa5ab727a7b5d0a Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.241293 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2","Type":"ContainerStarted","Data":"79744d37454d43ba5949a0fe1a16cc74a97718d8f8b734406de6128082390659"} Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.244219 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"dc355640-e2b0-4d27-8135-0ab3599cba98","Type":"ContainerStarted","Data":"f7ccd24385c80d3745b3f284c07737b8d548ee34b25e6bc4faa5ab727a7b5d0a"} Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.277630 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.011316294 podStartE2EDuration="3.277593825s" podCreationTimestamp="2025-10-13 14:35:05 +0000 UTC" firstStartedPulling="2025-10-13 14:35:05.891003504 +0000 UTC m=+5283.424553760" lastFinishedPulling="2025-10-13 14:35:06.157281035 +0000 UTC m=+5283.690831291" observedRunningTime="2025-10-13 14:35:08.272596622 +0000 UTC m=+5285.806146898" 
watchObservedRunningTime="2025-10-13 14:35:08.277593825 +0000 UTC m=+5285.811144081" Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.315504 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 14:35:08 crc kubenswrapper[4797]: I1013 14:35:08.679718 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 13 14:35:08 crc kubenswrapper[4797]: W1013 14:35:08.692109 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2579cd51_2c4b_4a29_993e_d38dfffead2b.slice/crio-340e35313be0133971124f21273694e487d4d067208df1b3afd17f4ddf199551 WatchSource:0}: Error finding container 340e35313be0133971124f21273694e487d4d067208df1b3afd17f4ddf199551: Status 404 returned error can't find the container with id 340e35313be0133971124f21273694e487d4d067208df1b3afd17f4ddf199551 Oct 13 14:35:09 crc kubenswrapper[4797]: I1013 14:35:09.258694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2579cd51-2c4b-4a29-993e-d38dfffead2b","Type":"ContainerStarted","Data":"0d1204f4c9e52c1059bc47621f04cc19f64a8dfd7e0f4a818b9f2b4ba57f3a45"} Oct 13 14:35:09 crc kubenswrapper[4797]: I1013 14:35:09.259838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2579cd51-2c4b-4a29-993e-d38dfffead2b","Type":"ContainerStarted","Data":"340e35313be0133971124f21273694e487d4d067208df1b3afd17f4ddf199551"} Oct 13 14:35:09 crc kubenswrapper[4797]: I1013 14:35:09.274512 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"dc355640-e2b0-4d27-8135-0ab3599cba98","Type":"ContainerStarted","Data":"ade361ddfcc44433da29c5dca55fa57676ccf1c3e23383188c87442f1d258228"} Oct 13 14:35:09 crc kubenswrapper[4797]: I1013 14:35:09.274735 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-volume-volume1-0" event={"ID":"dc355640-e2b0-4d27-8135-0ab3599cba98","Type":"ContainerStarted","Data":"a2303b7683852d7b981177ff04206327acec67b5f54e2afbaf34b663bc1c44c8"} Oct 13 14:35:09 crc kubenswrapper[4797]: I1013 14:35:09.308702 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=1.8912708010000001 podStartE2EDuration="2.308676653s" podCreationTimestamp="2025-10-13 14:35:07 +0000 UTC" firstStartedPulling="2025-10-13 14:35:08.13817389 +0000 UTC m=+5285.671724146" lastFinishedPulling="2025-10-13 14:35:08.555579742 +0000 UTC m=+5286.089129998" observedRunningTime="2025-10-13 14:35:09.299903888 +0000 UTC m=+5286.833454174" watchObservedRunningTime="2025-10-13 14:35:09.308676653 +0000 UTC m=+5286.842226909" Oct 13 14:35:09 crc kubenswrapper[4797]: I1013 14:35:09.828874 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.80:8776/healthcheck\": read tcp 10.217.0.2:37578->10.217.1.80:8776: read: connection reset by peer" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.276983 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.284611 4797 generic.go:334] "Generic (PLEG): container finished" podID="95071464-c69d-4edb-a00e-1f980c80301d" containerID="6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16" exitCode=0 Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.284660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95071464-c69d-4edb-a00e-1f980c80301d","Type":"ContainerDied","Data":"6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16"} Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.284704 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.284750 4797 scope.go:117] "RemoveContainer" containerID="6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.284737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95071464-c69d-4edb-a00e-1f980c80301d","Type":"ContainerDied","Data":"9c8e7c03442d46fc0fd9fe380ca62d69af7cec86ab87a4ece5cc524b31229a6d"} Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.286538 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2579cd51-2c4b-4a29-993e-d38dfffead2b","Type":"ContainerStarted","Data":"0eb34609e3e3464a61f5e88d27341f734260ce8f6e3f991405e86c4c8a366e22"} Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.335651 4797 scope.go:117] "RemoveContainer" containerID="92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.350180 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.02603854 podStartE2EDuration="3.350163337s" 
podCreationTimestamp="2025-10-13 14:35:07 +0000 UTC" firstStartedPulling="2025-10-13 14:35:08.69804788 +0000 UTC m=+5286.231598146" lastFinishedPulling="2025-10-13 14:35:09.022172687 +0000 UTC m=+5286.555722943" observedRunningTime="2025-10-13 14:35:10.345886752 +0000 UTC m=+5287.879437018" watchObservedRunningTime="2025-10-13 14:35:10.350163337 +0000 UTC m=+5287.883713593" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.362880 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.371760 4797 scope.go:117] "RemoveContainer" containerID="6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16" Oct 13 14:35:10 crc kubenswrapper[4797]: E1013 14:35:10.372179 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16\": container with ID starting with 6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16 not found: ID does not exist" containerID="6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.373012 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16"} err="failed to get container status \"6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16\": rpc error: code = NotFound desc = could not find container \"6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16\": container with ID starting with 6ea2f84867dbe40dd8ea2ee1d95b1a575bc72f41db261870c4191f01f445fa16 not found: ID does not exist" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.373050 4797 scope.go:117] "RemoveContainer" containerID="92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e" Oct 13 14:35:10 crc 
kubenswrapper[4797]: E1013 14:35:10.373533 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e\": container with ID starting with 92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e not found: ID does not exist" containerID="92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.373564 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e"} err="failed to get container status \"92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e\": rpc error: code = NotFound desc = could not find container \"92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e\": container with ID starting with 92bea98873451245b4bed5faa98fc8e5ecf8d317bd7702f42d473e22a4eabb8e not found: ID does not exist" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408403 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95071464-c69d-4edb-a00e-1f980c80301d-etc-machine-id\") pod \"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408467 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-combined-ca-bundle\") pod \"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408508 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95071464-c69d-4edb-a00e-1f980c80301d-logs\") pod 
\"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408525 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data\") pod \"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408577 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-scripts\") pod \"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408606 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkq4q\" (UniqueName: \"kubernetes.io/projected/95071464-c69d-4edb-a00e-1f980c80301d-kube-api-access-dkq4q\") pod \"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.408650 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data-custom\") pod \"95071464-c69d-4edb-a00e-1f980c80301d\" (UID: \"95071464-c69d-4edb-a00e-1f980c80301d\") " Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.409718 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95071464-c69d-4edb-a00e-1f980c80301d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.412935 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95071464-c69d-4edb-a00e-1f980c80301d-logs" (OuterVolumeSpecName: "logs") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.419924 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-scripts" (OuterVolumeSpecName: "scripts") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.420015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.425300 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95071464-c69d-4edb-a00e-1f980c80301d-kube-api-access-dkq4q" (OuterVolumeSpecName: "kube-api-access-dkq4q") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "kube-api-access-dkq4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.446963 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.474255 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data" (OuterVolumeSpecName: "config-data") pod "95071464-c69d-4edb-a00e-1f980c80301d" (UID: "95071464-c69d-4edb-a00e-1f980c80301d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.510346 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95071464-c69d-4edb-a00e-1f980c80301d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.510372 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.510381 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95071464-c69d-4edb-a00e-1f980c80301d-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.510391 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc 
kubenswrapper[4797]: I1013 14:35:10.510400 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.510409 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkq4q\" (UniqueName: \"kubernetes.io/projected/95071464-c69d-4edb-a00e-1f980c80301d-kube-api-access-dkq4q\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.510418 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95071464-c69d-4edb-a00e-1f980c80301d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.655514 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.680888 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.683253 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:35:10 crc kubenswrapper[4797]: E1013 14:35:10.683788 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api-log" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.683814 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api-log" Oct 13 14:35:10 crc kubenswrapper[4797]: E1013 14:35:10.683832 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.683837 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="95071464-c69d-4edb-a00e-1f980c80301d" 
containerName="cinder-api" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.683996 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api-log" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.684010 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="95071464-c69d-4edb-a00e-1f980c80301d" containerName="cinder-api" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.685046 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.692721 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.693406 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.815990 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.816343 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-scripts\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.816365 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-config-data\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " 
pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.816387 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.816617 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-logs\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.816839 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.816907 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98mmd\" (UniqueName: \"kubernetes.io/projected/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-kube-api-access-98mmd\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.918903 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.918984 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-98mmd\" (UniqueName: \"kubernetes.io/projected/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-kube-api-access-98mmd\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.919022 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.919084 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-scripts\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.919106 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-config-data\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.919132 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.919209 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-logs\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " 
pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.919716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-logs\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.920848 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.924409 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.927102 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.929794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-config-data\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.943883 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98mmd\" (UniqueName: 
\"kubernetes.io/projected/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-kube-api-access-98mmd\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:10 crc kubenswrapper[4797]: I1013 14:35:10.947206 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3bace6c-b376-4b6d-8759-5c90d5d5b02b-scripts\") pod \"cinder-api-0\" (UID: \"d3bace6c-b376-4b6d-8759-5c90d5d5b02b\") " pod="openstack/cinder-api-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.014345 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.246675 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95071464-c69d-4edb-a00e-1f980c80301d" path="/var/lib/kubelet/pods/95071464-c69d-4edb-a00e-1f980c80301d/volumes" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.434916 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.436517 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.451207 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.478472 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.484532 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.486306 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.491410 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 14:35:11 crc kubenswrapper[4797]: I1013 14:35:11.526583 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 14:35:11 crc kubenswrapper[4797]: W1013 14:35:11.527943 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3bace6c_b376_4b6d_8759_5c90d5d5b02b.slice/crio-b43d63ccae9a631117506aacb7621c8d19bd24553ef487a2199a2e50c9192c3c WatchSource:0}: Error finding container b43d63ccae9a631117506aacb7621c8d19bd24553ef487a2199a2e50c9192c3c: Status 404 returned error can't find the container with id b43d63ccae9a631117506aacb7621c8d19bd24553ef487a2199a2e50c9192c3c Oct 13 14:35:12 crc kubenswrapper[4797]: I1013 14:35:12.318275 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3bace6c-b376-4b6d-8759-5c90d5d5b02b","Type":"ContainerStarted","Data":"a7b1f37f4fb5cda9c93fec035d4b58e5c50c9ed33f982ea42793b191f2df1a40"} Oct 13 14:35:12 crc kubenswrapper[4797]: I1013 14:35:12.318639 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3bace6c-b376-4b6d-8759-5c90d5d5b02b","Type":"ContainerStarted","Data":"b43d63ccae9a631117506aacb7621c8d19bd24553ef487a2199a2e50c9192c3c"} Oct 13 14:35:12 crc kubenswrapper[4797]: I1013 14:35:12.319215 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 14:35:12 crc kubenswrapper[4797]: I1013 14:35:12.320584 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 14:35:12 crc kubenswrapper[4797]: I1013 14:35:12.349587 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 14:35:12 crc kubenswrapper[4797]: I1013 14:35:12.410444 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:13 crc kubenswrapper[4797]: I1013 14:35:13.088655 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 13 14:35:13 crc kubenswrapper[4797]: I1013 14:35:13.326845 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3bace6c-b376-4b6d-8759-5c90d5d5b02b","Type":"ContainerStarted","Data":"745d19a4283423e4d7a605c92052aa4a07dacfac1ab51fad6cac603782f8e79a"} Oct 13 14:35:13 crc kubenswrapper[4797]: I1013 14:35:13.347326 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.3473058780000002 podStartE2EDuration="3.347305878s" podCreationTimestamp="2025-10-13 14:35:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:35:13.341340552 +0000 UTC m=+5290.874890808" watchObservedRunningTime="2025-10-13 14:35:13.347305878 +0000 UTC m=+5290.880856144" Oct 13 14:35:14 crc kubenswrapper[4797]: I1013 14:35:14.336642 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 13 14:35:14 crc kubenswrapper[4797]: I1013 14:35:14.822555 4797 scope.go:117] "RemoveContainer" containerID="89da6a592f418a5587dcdc6f2d22eea87653bc2729cf75cd3d262e4c57251374" Oct 13 14:35:14 crc kubenswrapper[4797]: I1013 14:35:14.864900 4797 scope.go:117] "RemoveContainer" containerID="9afb6cd0252adecf77addc7338033b8e4da0e1ec06022eb8854f48b3a5cbc2d0" Oct 13 14:35:15 crc kubenswrapper[4797]: I1013 14:35:15.590471 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 13 14:35:15 crc kubenswrapper[4797]: I1013 14:35:15.671211 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:16 crc kubenswrapper[4797]: I1013 14:35:16.353100 
4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="cinder-scheduler" containerID="cri-o://e0cb55008831959f12e5d7e58804c2cbab8e388757dd30e23bd0047d59c1b5d8" gracePeriod=30 Oct 13 14:35:16 crc kubenswrapper[4797]: I1013 14:35:16.353164 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="probe" containerID="cri-o://79744d37454d43ba5949a0fe1a16cc74a97718d8f8b734406de6128082390659" gracePeriod=30 Oct 13 14:35:17 crc kubenswrapper[4797]: I1013 14:35:17.363321 4797 generic.go:334] "Generic (PLEG): container finished" podID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerID="79744d37454d43ba5949a0fe1a16cc74a97718d8f8b734406de6128082390659" exitCode=0 Oct 13 14:35:17 crc kubenswrapper[4797]: I1013 14:35:17.363676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2","Type":"ContainerDied","Data":"79744d37454d43ba5949a0fe1a16cc74a97718d8f8b734406de6128082390659"} Oct 13 14:35:17 crc kubenswrapper[4797]: I1013 14:35:17.628312 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.047533 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lq62q"] Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.059357 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lq62q"] Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.319192 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.374045 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerID="e0cb55008831959f12e5d7e58804c2cbab8e388757dd30e23bd0047d59c1b5d8" exitCode=0 Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.374086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2","Type":"ContainerDied","Data":"e0cb55008831959f12e5d7e58804c2cbab8e388757dd30e23bd0047d59c1b5d8"} Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.704499 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.803645 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-scripts\") pod \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.803694 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-etc-machine-id\") pod \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.803743 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data\") pod \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.803796 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sftwk\" (UniqueName: \"kubernetes.io/projected/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-kube-api-access-sftwk\") pod \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\" (UID: 
\"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.803977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data-custom\") pod \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.804017 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-combined-ca-bundle\") pod \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\" (UID: \"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2\") " Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.804192 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" (UID: "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.804623 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.810759 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" (UID: "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.820950 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-scripts" (OuterVolumeSpecName: "scripts") pod "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" (UID: "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.822759 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-kube-api-access-sftwk" (OuterVolumeSpecName: "kube-api-access-sftwk") pod "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" (UID: "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2"). InnerVolumeSpecName "kube-api-access-sftwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.859861 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" (UID: "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.906674 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.906714 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.906728 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.906740 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sftwk\" (UniqueName: \"kubernetes.io/projected/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-kube-api-access-sftwk\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:18 crc kubenswrapper[4797]: I1013 14:35:18.924980 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data" (OuterVolumeSpecName: "config-data") pod "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" (UID: "fbf7ae4b-2087-46c3-b7ee-0fe3223215c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.008062 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.245405 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a582257-89db-4b9f-926a-6631e27ee53e" path="/var/lib/kubelet/pods/9a582257-89db-4b9f-926a-6631e27ee53e/volumes" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.391132 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf7ae4b-2087-46c3-b7ee-0fe3223215c2","Type":"ContainerDied","Data":"d66849e0667782afc7ddf7496f38c85bb1addc93f1d0ad66531b3aba9c6d3b0b"} Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.391207 4797 scope.go:117] "RemoveContainer" containerID="79744d37454d43ba5949a0fe1a16cc74a97718d8f8b734406de6128082390659" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.391286 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.418320 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.427476 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.430156 4797 scope.go:117] "RemoveContainer" containerID="e0cb55008831959f12e5d7e58804c2cbab8e388757dd30e23bd0047d59c1b5d8" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.457090 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:19 crc kubenswrapper[4797]: E1013 14:35:19.457648 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="cinder-scheduler" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.457670 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="cinder-scheduler" Oct 13 14:35:19 crc kubenswrapper[4797]: E1013 14:35:19.457694 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="probe" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.457700 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="probe" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.457901 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="cinder-scheduler" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.457922 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" containerName="probe" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.459084 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.466090 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.470880 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.515596 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-config-data\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.515650 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2079f065-f421-4b28-8023-926aa90e9f63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.515687 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdpkg\" (UniqueName: \"kubernetes.io/projected/2079f065-f421-4b28-8023-926aa90e9f63-kube-api-access-wdpkg\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.515728 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-scripts\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.515780 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.515941 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617322 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2079f065-f421-4b28-8023-926aa90e9f63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617398 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdpkg\" (UniqueName: \"kubernetes.io/projected/2079f065-f421-4b28-8023-926aa90e9f63-kube-api-access-wdpkg\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617445 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-scripts\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617481 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617582 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617639 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-config-data\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.617629 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2079f065-f421-4b28-8023-926aa90e9f63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.622530 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.622557 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " 
pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.623069 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-config-data\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.623584 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2079f065-f421-4b28-8023-926aa90e9f63-scripts\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.637402 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdpkg\" (UniqueName: \"kubernetes.io/projected/2079f065-f421-4b28-8023-926aa90e9f63-kube-api-access-wdpkg\") pod \"cinder-scheduler-0\" (UID: \"2079f065-f421-4b28-8023-926aa90e9f63\") " pod="openstack/cinder-scheduler-0" Oct 13 14:35:19 crc kubenswrapper[4797]: I1013 14:35:19.795057 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 14:35:20 crc kubenswrapper[4797]: I1013 14:35:20.237301 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b" Oct 13 14:35:20 crc kubenswrapper[4797]: I1013 14:35:20.320855 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 14:35:20 crc kubenswrapper[4797]: I1013 14:35:20.406069 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2079f065-f421-4b28-8023-926aa90e9f63","Type":"ContainerStarted","Data":"d5d0a08a949827b7d3ea5d663e54fc7914900d7042ae82c54dccfe0cdd496235"} Oct 13 14:35:21 crc kubenswrapper[4797]: I1013 14:35:21.376644 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf7ae4b-2087-46c3-b7ee-0fe3223215c2" path="/var/lib/kubelet/pods/fbf7ae4b-2087-46c3-b7ee-0fe3223215c2/volumes" Oct 13 14:35:21 crc kubenswrapper[4797]: I1013 14:35:21.437302 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2079f065-f421-4b28-8023-926aa90e9f63","Type":"ContainerStarted","Data":"ca4f858933e99c455964a99452aa651401b3b0ce97d2ad04cabd67b7d0833f53"} Oct 13 14:35:21 crc kubenswrapper[4797]: I1013 14:35:21.449942 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"f6beaa7adf1d21db8fdfdf908e4a91ef09e840de8f57f89fa2a6f4402ae41c29"} Oct 13 14:35:22 crc kubenswrapper[4797]: I1013 14:35:22.462323 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2079f065-f421-4b28-8023-926aa90e9f63","Type":"ContainerStarted","Data":"21df7657585634f91a10bd3e8e39d4a9945b19b18166451cb68d1d45495708d6"} Oct 13 14:35:22 crc kubenswrapper[4797]: I1013 14:35:22.483852 4797 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.483826805 podStartE2EDuration="3.483826805s" podCreationTimestamp="2025-10-13 14:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:35:22.483109758 +0000 UTC m=+5300.016660014" watchObservedRunningTime="2025-10-13 14:35:22.483826805 +0000 UTC m=+5300.017377071" Oct 13 14:35:23 crc kubenswrapper[4797]: I1013 14:35:23.040696 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 13 14:35:24 crc kubenswrapper[4797]: I1013 14:35:24.796198 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 13 14:35:30 crc kubenswrapper[4797]: I1013 14:35:30.027637 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 13 14:35:31 crc kubenswrapper[4797]: I1013 14:35:31.060983 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8gv9r"] Oct 13 14:35:31 crc kubenswrapper[4797]: I1013 14:35:31.068204 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8gv9r"] Oct 13 14:35:31 crc kubenswrapper[4797]: I1013 14:35:31.250654 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38a41c7-6328-45df-b131-0f4e6a74563a" path="/var/lib/kubelet/pods/e38a41c7-6328-45df-b131-0f4e6a74563a/volumes" Oct 13 14:36:15 crc kubenswrapper[4797]: I1013 14:36:15.065826 4797 scope.go:117] "RemoveContainer" containerID="f80dbaf2dd4bcac7be154cafd83514fd4c13497e18e2dff9db8d1b2f8741e7d2" Oct 13 14:36:15 crc kubenswrapper[4797]: I1013 14:36:15.127937 4797 scope.go:117] "RemoveContainer" containerID="3c5d78b44e0d4a868b59d440a805df697edae9b0ba54a1c1a46167b209e09cd0" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.110547 4797 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-7lh88"] Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.112954 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.137583 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7lh88"] Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.177030 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vtv6\" (UniqueName: \"kubernetes.io/projected/d736f10d-90a5-43e1-8f2d-84e8c82b497e-kube-api-access-6vtv6\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.177280 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-utilities\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.177358 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-catalog-content\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.278781 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vtv6\" (UniqueName: \"kubernetes.io/projected/d736f10d-90a5-43e1-8f2d-84e8c82b497e-kube-api-access-6vtv6\") pod \"redhat-operators-7lh88\" (UID: 
\"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.279126 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-utilities\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.279161 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-catalog-content\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.280138 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-utilities\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.281019 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-catalog-content\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.299261 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vtv6\" (UniqueName: \"kubernetes.io/projected/d736f10d-90a5-43e1-8f2d-84e8c82b497e-kube-api-access-6vtv6\") pod \"redhat-operators-7lh88\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " 
pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.440673 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:48 crc kubenswrapper[4797]: I1013 14:36:48.972377 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7lh88"] Oct 13 14:36:49 crc kubenswrapper[4797]: I1013 14:36:49.345358 4797 generic.go:334] "Generic (PLEG): container finished" podID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerID="a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33" exitCode=0 Oct 13 14:36:49 crc kubenswrapper[4797]: I1013 14:36:49.345463 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerDied","Data":"a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33"} Oct 13 14:36:49 crc kubenswrapper[4797]: I1013 14:36:49.345655 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerStarted","Data":"15f91a45d9ab64807e2cc3e258e9a666641561571529e1dca2b773b2842607e8"} Oct 13 14:36:49 crc kubenswrapper[4797]: I1013 14:36:49.347712 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:36:50 crc kubenswrapper[4797]: I1013 14:36:50.357425 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerStarted","Data":"aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716"} Oct 13 14:36:50 crc kubenswrapper[4797]: E1013 14:36:50.738672 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd736f10d_90a5_43e1_8f2d_84e8c82b497e.slice/crio-aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716.scope\": RecentStats: unable to find data in memory cache]" Oct 13 14:36:51 crc kubenswrapper[4797]: I1013 14:36:51.365950 4797 generic.go:334] "Generic (PLEG): container finished" podID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerID="aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716" exitCode=0 Oct 13 14:36:51 crc kubenswrapper[4797]: I1013 14:36:51.366028 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerDied","Data":"aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716"} Oct 13 14:36:53 crc kubenswrapper[4797]: I1013 14:36:53.405679 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerStarted","Data":"ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d"} Oct 13 14:36:53 crc kubenswrapper[4797]: I1013 14:36:53.429938 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7lh88" podStartSLOduration=2.95375164 podStartE2EDuration="5.429918394s" podCreationTimestamp="2025-10-13 14:36:48 +0000 UTC" firstStartedPulling="2025-10-13 14:36:49.347435965 +0000 UTC m=+5386.880986231" lastFinishedPulling="2025-10-13 14:36:51.823602729 +0000 UTC m=+5389.357152985" observedRunningTime="2025-10-13 14:36:53.424976443 +0000 UTC m=+5390.958526709" watchObservedRunningTime="2025-10-13 14:36:53.429918394 +0000 UTC m=+5390.963468650" Oct 13 14:36:58 crc kubenswrapper[4797]: I1013 14:36:58.442409 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:58 crc kubenswrapper[4797]: 
I1013 14:36:58.443865 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:58 crc kubenswrapper[4797]: I1013 14:36:58.521712 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:59 crc kubenswrapper[4797]: I1013 14:36:59.520711 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:36:59 crc kubenswrapper[4797]: I1013 14:36:59.575403 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7lh88"] Oct 13 14:37:01 crc kubenswrapper[4797]: I1013 14:37:01.489716 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7lh88" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="registry-server" containerID="cri-o://ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d" gracePeriod=2 Oct 13 14:37:01 crc kubenswrapper[4797]: I1013 14:37:01.931106 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.035953 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-utilities\") pod \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.036140 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vtv6\" (UniqueName: \"kubernetes.io/projected/d736f10d-90a5-43e1-8f2d-84e8c82b497e-kube-api-access-6vtv6\") pod \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.036176 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-catalog-content\") pod \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\" (UID: \"d736f10d-90a5-43e1-8f2d-84e8c82b497e\") " Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.037221 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-utilities" (OuterVolumeSpecName: "utilities") pod "d736f10d-90a5-43e1-8f2d-84e8c82b497e" (UID: "d736f10d-90a5-43e1-8f2d-84e8c82b497e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.047004 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d736f10d-90a5-43e1-8f2d-84e8c82b497e-kube-api-access-6vtv6" (OuterVolumeSpecName: "kube-api-access-6vtv6") pod "d736f10d-90a5-43e1-8f2d-84e8c82b497e" (UID: "d736f10d-90a5-43e1-8f2d-84e8c82b497e"). InnerVolumeSpecName "kube-api-access-6vtv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.139396 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vtv6\" (UniqueName: \"kubernetes.io/projected/d736f10d-90a5-43e1-8f2d-84e8c82b497e-kube-api-access-6vtv6\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.139439 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.140035 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d736f10d-90a5-43e1-8f2d-84e8c82b497e" (UID: "d736f10d-90a5-43e1-8f2d-84e8c82b497e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.241406 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d736f10d-90a5-43e1-8f2d-84e8c82b497e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.391836 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c8799d87c-bmvq7"] Oct 13 14:37:02 crc kubenswrapper[4797]: E1013 14:37:02.392310 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="registry-server" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.392325 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="registry-server" Oct 13 14:37:02 crc kubenswrapper[4797]: E1013 14:37:02.392350 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="extract-utilities" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.392361 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="extract-utilities" Oct 13 14:37:02 crc kubenswrapper[4797]: E1013 14:37:02.392389 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="extract-content" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.392398 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="extract-content" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.392739 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerName="registry-server" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.394079 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.397697 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.397924 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.398145 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tsrps" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.398279 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.414724 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c8799d87c-bmvq7"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.457022 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.457311 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-log" containerID="cri-o://65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95" gracePeriod=30 Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.457890 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-httpd" containerID="cri-o://1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a" gracePeriod=30 Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.514318 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.514568 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-log" containerID="cri-o://ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213" gracePeriod=30 Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.515022 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-httpd" containerID="cri-o://df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4" gracePeriod=30 Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.530900 4797 generic.go:334] "Generic (PLEG): container finished" podID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" containerID="ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d" exitCode=0 Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.531208 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerDied","Data":"ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d"} Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.531232 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lh88" event={"ID":"d736f10d-90a5-43e1-8f2d-84e8c82b497e","Type":"ContainerDied","Data":"15f91a45d9ab64807e2cc3e258e9a666641561571529e1dca2b773b2842607e8"} Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.531248 4797 scope.go:117] "RemoveContainer" containerID="ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.531376 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7lh88" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.548263 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-config-data\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.548374 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68zvj\" (UniqueName: \"kubernetes.io/projected/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-kube-api-access-68zvj\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.548414 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-horizon-secret-key\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.548486 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-scripts\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.548510 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-logs\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.572573 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-658c7589bf-7sm9b"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.574482 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.594998 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-658c7589bf-7sm9b"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.612214 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7lh88"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.630014 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7lh88"] Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.647861 4797 scope.go:117] "RemoveContainer" containerID="aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.655543 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-config-data\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.655642 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68zvj\" (UniqueName: \"kubernetes.io/projected/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-kube-api-access-68zvj\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.655674 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-horizon-secret-key\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.655886 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-scripts\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.655920 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-logs\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.657309 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-logs\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.657942 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-scripts\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.661486 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-config-data\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.666541 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-horizon-secret-key\") pod \"horizon-5c8799d87c-bmvq7\" 
(UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.676561 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68zvj\" (UniqueName: \"kubernetes.io/projected/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-kube-api-access-68zvj\") pod \"horizon-5c8799d87c-bmvq7\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.715319 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.757716 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5hq\" (UniqueName: \"kubernetes.io/projected/97db726c-fe9c-4730-9f04-004a744500f2-kube-api-access-mc5hq\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.757792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97db726c-fe9c-4730-9f04-004a744500f2-horizon-secret-key\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.758059 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-config-data\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.758268 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97db726c-fe9c-4730-9f04-004a744500f2-logs\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.758319 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-scripts\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.777708 4797 scope.go:117] "RemoveContainer" containerID="a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.857831 4797 scope.go:117] "RemoveContainer" containerID="ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d" Oct 13 14:37:02 crc kubenswrapper[4797]: E1013 14:37:02.858716 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d\": container with ID starting with ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d not found: ID does not exist" containerID="ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.858767 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d"} err="failed to get container status \"ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d\": rpc error: code = NotFound desc = could not find container \"ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d\": container with ID starting with 
ef2c8c0781f2f6a55cfb7a23cd2f36bc58cfb4351b922356bab176ce106f9f9d not found: ID does not exist" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.858875 4797 scope.go:117] "RemoveContainer" containerID="aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716" Oct 13 14:37:02 crc kubenswrapper[4797]: E1013 14:37:02.859276 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716\": container with ID starting with aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716 not found: ID does not exist" containerID="aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.859309 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716"} err="failed to get container status \"aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716\": rpc error: code = NotFound desc = could not find container \"aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716\": container with ID starting with aa0245896a1d2c11d9b832f7bef6577826fc247503652e135b70bedcb916c716 not found: ID does not exist" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.859329 4797 scope.go:117] "RemoveContainer" containerID="a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33" Oct 13 14:37:02 crc kubenswrapper[4797]: E1013 14:37:02.859583 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33\": container with ID starting with a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33 not found: ID does not exist" containerID="a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33" Oct 13 14:37:02 crc 
kubenswrapper[4797]: I1013 14:37:02.859605 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33"} err="failed to get container status \"a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33\": rpc error: code = NotFound desc = could not find container \"a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33\": container with ID starting with a0f3af894db6797f1c348466a759d8b2fc6bfb0f7efa680a4c6f0746f638de33 not found: ID does not exist" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.859733 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-scripts\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.859897 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5hq\" (UniqueName: \"kubernetes.io/projected/97db726c-fe9c-4730-9f04-004a744500f2-kube-api-access-mc5hq\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.859944 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97db726c-fe9c-4730-9f04-004a744500f2-horizon-secret-key\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.860018 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-config-data\") pod 
\"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.860090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97db726c-fe9c-4730-9f04-004a744500f2-logs\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.860998 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97db726c-fe9c-4730-9f04-004a744500f2-logs\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.861027 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-scripts\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.863149 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-config-data\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.883846 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97db726c-fe9c-4730-9f04-004a744500f2-horizon-secret-key\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 
14:37:02.887268 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5hq\" (UniqueName: \"kubernetes.io/projected/97db726c-fe9c-4730-9f04-004a744500f2-kube-api-access-mc5hq\") pod \"horizon-658c7589bf-7sm9b\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:02 crc kubenswrapper[4797]: I1013 14:37:02.930588 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.123879 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c8799d87c-bmvq7"] Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.156190 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c5c76c54c-hrm4p"] Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.159549 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.166160 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-logs\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.166207 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-config-data\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.166255 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-horizon-secret-key\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.166362 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmswn\" (UniqueName: \"kubernetes.io/projected/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-kube-api-access-pmswn\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.166437 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-scripts\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.183294 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c5c76c54c-hrm4p"] Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.253543 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d736f10d-90a5-43e1-8f2d-84e8c82b497e" path="/var/lib/kubelet/pods/d736f10d-90a5-43e1-8f2d-84e8c82b497e/volumes" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.262097 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c8799d87c-bmvq7"] Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.272996 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-scripts\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 
14:37:03.273070 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-logs\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.273107 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-config-data\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.273154 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-horizon-secret-key\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.273300 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmswn\" (UniqueName: \"kubernetes.io/projected/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-kube-api-access-pmswn\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.273922 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-logs\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.273992 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-scripts\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.275399 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-config-data\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.289718 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-horizon-secret-key\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.305403 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmswn\" (UniqueName: \"kubernetes.io/projected/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-kube-api-access-pmswn\") pod \"horizon-7c5c76c54c-hrm4p\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.484282 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.495287 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-658c7589bf-7sm9b"] Oct 13 14:37:03 crc kubenswrapper[4797]: W1013 14:37:03.511173 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97db726c_fe9c_4730_9f04_004a744500f2.slice/crio-4e18acf03551ac80dcf6fd5515cc44f8ac454953b329875436d01e1e518420f2 WatchSource:0}: Error finding container 4e18acf03551ac80dcf6fd5515cc44f8ac454953b329875436d01e1e518420f2: Status 404 returned error can't find the container with id 4e18acf03551ac80dcf6fd5515cc44f8ac454953b329875436d01e1e518420f2 Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.555833 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658c7589bf-7sm9b" event={"ID":"97db726c-fe9c-4730-9f04-004a744500f2","Type":"ContainerStarted","Data":"4e18acf03551ac80dcf6fd5515cc44f8ac454953b329875436d01e1e518420f2"} Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.557112 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8799d87c-bmvq7" event={"ID":"d9308db4-47d9-45e2-a059-a2aeed2ff9fe","Type":"ContainerStarted","Data":"49f339eacb97bfa0b317cdd994a775b94ff67d1c5e2a8e3cd704ffa95c2aa5e6"} Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.559412 4797 generic.go:334] "Generic (PLEG): container finished" podID="4129fe47-83ce-4c43-9549-39be0607dc11" containerID="ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213" exitCode=143 Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.559497 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4129fe47-83ce-4c43-9549-39be0607dc11","Type":"ContainerDied","Data":"ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213"} Oct 13 14:37:03 crc 
kubenswrapper[4797]: I1013 14:37:03.562791 4797 generic.go:334] "Generic (PLEG): container finished" podID="42c07aef-3157-4265-b940-0d838eb32e9f" containerID="65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95" exitCode=143 Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.562872 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c07aef-3157-4265-b940-0d838eb32e9f","Type":"ContainerDied","Data":"65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95"} Oct 13 14:37:03 crc kubenswrapper[4797]: I1013 14:37:03.988561 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c5c76c54c-hrm4p"] Oct 13 14:37:04 crc kubenswrapper[4797]: I1013 14:37:04.576225 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c5c76c54c-hrm4p" event={"ID":"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d","Type":"ContainerStarted","Data":"ad59835e11fb8bbbb708e936ea32341d862776e9a6ae08b77172f6eb52365504"} Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.425025 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.542116 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548100 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-config-data\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548155 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-scripts\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548282 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-logs\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548414 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctjx6\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-kube-api-access-ctjx6\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548440 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-combined-ca-bundle\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548462 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-ceph\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.548500 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-httpd-run\") pod \"42c07aef-3157-4265-b940-0d838eb32e9f\" (UID: \"42c07aef-3157-4265-b940-0d838eb32e9f\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.549501 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.550593 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-logs" (OuterVolumeSpecName: "logs") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.554272 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-ceph" (OuterVolumeSpecName: "ceph") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.556700 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-scripts" (OuterVolumeSpecName: "scripts") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.575119 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-kube-api-access-ctjx6" (OuterVolumeSpecName: "kube-api-access-ctjx6") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "kube-api-access-ctjx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.599122 4797 generic.go:334] "Generic (PLEG): container finished" podID="42c07aef-3157-4265-b940-0d838eb32e9f" containerID="1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a" exitCode=0 Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.599187 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c07aef-3157-4265-b940-0d838eb32e9f","Type":"ContainerDied","Data":"1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a"} Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.599215 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c07aef-3157-4265-b940-0d838eb32e9f","Type":"ContainerDied","Data":"2ac51a03f59ebee4449a2e8cf71cf7257dd79df1fe56dd56fa6a1924b14e82a2"} Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.599232 4797 scope.go:117] "RemoveContainer" containerID="1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a" Oct 
13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.599345 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.599902 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.608369 4797 generic.go:334] "Generic (PLEG): container finished" podID="4129fe47-83ce-4c43-9549-39be0607dc11" containerID="df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4" exitCode=0 Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.608411 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4129fe47-83ce-4c43-9549-39be0607dc11","Type":"ContainerDied","Data":"df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4"} Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.608436 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4129fe47-83ce-4c43-9549-39be0607dc11","Type":"ContainerDied","Data":"783896fddb198770f7f3ec5a58b0c1b71ef87d5b06737f06e385c667477436eb"} Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.608508 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.631966 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-config-data" (OuterVolumeSpecName: "config-data") pod "42c07aef-3157-4265-b940-0d838eb32e9f" (UID: "42c07aef-3157-4265-b940-0d838eb32e9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650488 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-scripts\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650572 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-config-data\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650613 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-ceph\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-logs\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650779 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-combined-ca-bundle\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650817 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27c45\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-kube-api-access-27c45\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.650855 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-httpd-run\") pod \"4129fe47-83ce-4c43-9549-39be0607dc11\" (UID: \"4129fe47-83ce-4c43-9549-39be0607dc11\") " Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651103 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctjx6\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-kube-api-access-ctjx6\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651114 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651124 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42c07aef-3157-4265-b940-0d838eb32e9f-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651133 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 
14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651141 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651149 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c07aef-3157-4265-b940-0d838eb32e9f-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651157 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c07aef-3157-4265-b940-0d838eb32e9f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.651490 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.653434 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-logs" (OuterVolumeSpecName: "logs") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.661047 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-scripts" (OuterVolumeSpecName: "scripts") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.661260 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-kube-api-access-27c45" (OuterVolumeSpecName: "kube-api-access-27c45") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "kube-api-access-27c45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.661381 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-ceph" (OuterVolumeSpecName: "ceph") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.666564 4797 scope.go:117] "RemoveContainer" containerID="65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.689281 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.697151 4797 scope.go:117] "RemoveContainer" containerID="1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a" Oct 13 14:37:06 crc kubenswrapper[4797]: E1013 14:37:06.697674 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a\": container with ID starting with 1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a not found: ID does not exist" containerID="1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.697744 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a"} err="failed to get container status \"1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a\": rpc error: code = NotFound desc = could not find container \"1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a\": container with ID starting with 1e2c86c583fb966dae97514dbe98d91f56ac5589ee97b753c4018672f0c3109a not found: ID does not exist" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.697766 4797 scope.go:117] "RemoveContainer" containerID="65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95" Oct 13 14:37:06 crc kubenswrapper[4797]: E1013 14:37:06.698321 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95\": container with ID starting with 65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95 not found: ID does not exist" containerID="65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.698342 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95"} err="failed to get container status \"65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95\": rpc error: code = NotFound desc = could not find container \"65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95\": container with ID starting with 65bcd9b8b6054f2b3ecd43dc1183e377ec852f26140bb43c56e152253bca6e95 not found: ID does not exist" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.698361 4797 scope.go:117] "RemoveContainer" containerID="df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.721359 4797 scope.go:117] "RemoveContainer" containerID="ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.734890 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-config-data" (OuterVolumeSpecName: "config-data") pod "4129fe47-83ce-4c43-9549-39be0607dc11" (UID: "4129fe47-83ce-4c43-9549-39be0607dc11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.752612 4797 scope.go:117] "RemoveContainer" containerID="df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.752890 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.752971 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753008 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753016 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753043 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4129fe47-83ce-4c43-9549-39be0607dc11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753052 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27c45\" (UniqueName: \"kubernetes.io/projected/4129fe47-83ce-4c43-9549-39be0607dc11-kube-api-access-27c45\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753062 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4129fe47-83ce-4c43-9549-39be0607dc11-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:06 crc kubenswrapper[4797]: E1013 14:37:06.753144 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4\": container with ID starting with df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4 not found: ID does not exist" containerID="df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753169 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4"} err="failed to get container status \"df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4\": rpc error: code = NotFound desc = could not find container \"df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4\": container with ID starting with df9ffaffb3948f1cee13cbad7d818dd5004c66fe554b30b53c353c221e2d8ea4 not found: ID does not exist" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753191 4797 scope.go:117] "RemoveContainer" containerID="ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213" Oct 13 14:37:06 crc kubenswrapper[4797]: E1013 14:37:06.753698 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213\": container with ID starting with ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213 not found: ID does not exist" containerID="ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.753776 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213"} err="failed to get container status \"ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213\": rpc error: code = NotFound desc = could not find container \"ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213\": container with ID starting with ae8411c1bcc54e7c84453171d93e0c8b45852913b5a9ab2433f22184cffd6213 not found: ID does not exist" Oct 13 14:37:06 crc kubenswrapper[4797]: I1013 14:37:06.990491 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.005045 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.023524 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.039781 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.052146 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: E1013 14:37:07.053078 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-httpd" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.053094 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-httpd" Oct 13 14:37:07 crc kubenswrapper[4797]: E1013 14:37:07.053126 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-httpd" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.053132 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-httpd" Oct 13 14:37:07 crc kubenswrapper[4797]: E1013 14:37:07.053143 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-log" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.053148 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-log" Oct 13 14:37:07 crc kubenswrapper[4797]: E1013 14:37:07.053195 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-log" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.053202 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-log" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.066153 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-log" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.066647 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-log" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.066749 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" containerName="glance-httpd" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.067076 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" containerName="glance-httpd" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.071914 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.072980 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.086034 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tgh5n" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.086676 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.086947 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.092355 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.103326 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.110529 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.130478 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177690 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177733 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82fccc24-58d1-4521-9a93-be26f53cb8c3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177761 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85jkj\" (UniqueName: \"kubernetes.io/projected/7332de29-571b-4dfd-8049-6b3c749109cc-kube-api-access-85jkj\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177789 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-scripts\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177839 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7332de29-571b-4dfd-8049-6b3c749109cc-logs\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177885 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177932 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7332de29-571b-4dfd-8049-6b3c749109cc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.177987 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82fccc24-58d1-4521-9a93-be26f53cb8c3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.178006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.178050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpcf\" (UniqueName: \"kubernetes.io/projected/82fccc24-58d1-4521-9a93-be26f53cb8c3-kube-api-access-5cpcf\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.178685 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82fccc24-58d1-4521-9a93-be26f53cb8c3-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.178741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7332de29-571b-4dfd-8049-6b3c749109cc-ceph\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.178771 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.253835 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4129fe47-83ce-4c43-9549-39be0607dc11" path="/var/lib/kubelet/pods/4129fe47-83ce-4c43-9549-39be0607dc11/volumes" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.254798 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c07aef-3157-4265-b940-0d838eb32e9f" path="/var/lib/kubelet/pods/42c07aef-3157-4265-b940-0d838eb32e9f/volumes" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7332de29-571b-4dfd-8049-6b3c749109cc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280091 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/82fccc24-58d1-4521-9a93-be26f53cb8c3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280130 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpcf\" (UniqueName: \"kubernetes.io/projected/82fccc24-58d1-4521-9a93-be26f53cb8c3-kube-api-access-5cpcf\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280191 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82fccc24-58d1-4521-9a93-be26f53cb8c3-logs\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280221 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7332de29-571b-4dfd-8049-6b3c749109cc-ceph\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280301 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280330 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-config-data\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280357 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82fccc24-58d1-4521-9a93-be26f53cb8c3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85jkj\" (UniqueName: \"kubernetes.io/projected/7332de29-571b-4dfd-8049-6b3c749109cc-kube-api-access-85jkj\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280396 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280415 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7332de29-571b-4dfd-8049-6b3c749109cc-logs\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.280784 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7332de29-571b-4dfd-8049-6b3c749109cc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.284412 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82fccc24-58d1-4521-9a93-be26f53cb8c3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.284751 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7332de29-571b-4dfd-8049-6b3c749109cc-logs\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 
14:37:07.286005 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7332de29-571b-4dfd-8049-6b3c749109cc-ceph\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.286094 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-scripts\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.293707 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82fccc24-58d1-4521-9a93-be26f53cb8c3-logs\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.293876 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.294009 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.294791 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/82fccc24-58d1-4521-9a93-be26f53cb8c3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.295732 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82fccc24-58d1-4521-9a93-be26f53cb8c3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.296022 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.296403 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85jkj\" (UniqueName: \"kubernetes.io/projected/7332de29-571b-4dfd-8049-6b3c749109cc-kube-api-access-85jkj\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.297901 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7332de29-571b-4dfd-8049-6b3c749109cc-config-data\") pod \"glance-default-external-api-0\" (UID: \"7332de29-571b-4dfd-8049-6b3c749109cc\") " pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.298122 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpcf\" (UniqueName: 
\"kubernetes.io/projected/82fccc24-58d1-4521-9a93-be26f53cb8c3-kube-api-access-5cpcf\") pod \"glance-default-internal-api-0\" (UID: \"82fccc24-58d1-4521-9a93-be26f53cb8c3\") " pod="openstack/glance-default-internal-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.433227 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 14:37:07 crc kubenswrapper[4797]: I1013 14:37:07.453168 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:11 crc kubenswrapper[4797]: I1013 14:37:11.943358 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 14:37:12 crc kubenswrapper[4797]: W1013 14:37:12.046552 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7332de29_571b_4dfd_8049_6b3c749109cc.slice/crio-fde09e820542440d9afc9e0e32e51d230116eb2658757125fc327f25be664423 WatchSource:0}: Error finding container fde09e820542440d9afc9e0e32e51d230116eb2658757125fc327f25be664423: Status 404 returned error can't find the container with id fde09e820542440d9afc9e0e32e51d230116eb2658757125fc327f25be664423 Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.048551 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.703480 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82fccc24-58d1-4521-9a93-be26f53cb8c3","Type":"ContainerStarted","Data":"1996509cd3494352cde2b04f6e60ff24596c632b1c2f77e54fc2814be657e2e5"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.703780 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"82fccc24-58d1-4521-9a93-be26f53cb8c3","Type":"ContainerStarted","Data":"d72af68a5d822e8b986eed9fad05aa16c9db0904ffe07ab23da02669d0e80596"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.706911 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658c7589bf-7sm9b" event={"ID":"97db726c-fe9c-4730-9f04-004a744500f2","Type":"ContainerStarted","Data":"b4a2dd7f13a5d90da8f935842304898343aba422001a545e60199494f4fdd976"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.706959 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658c7589bf-7sm9b" event={"ID":"97db726c-fe9c-4730-9f04-004a744500f2","Type":"ContainerStarted","Data":"447a1ed385b1a6540864ff6af160c4364c22bc2ef6072fe62de862b303fb9854"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.713068 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8799d87c-bmvq7" event={"ID":"d9308db4-47d9-45e2-a059-a2aeed2ff9fe","Type":"ContainerStarted","Data":"4d7d0419a2c99ac2a41586562f3358a7995484e3ef8a269e0186734637b27a6f"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.713118 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8799d87c-bmvq7" event={"ID":"d9308db4-47d9-45e2-a059-a2aeed2ff9fe","Type":"ContainerStarted","Data":"ac2a9c93717e421777017830530b83cdf2746386cbc77a87f3dca5a3bb336fe3"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.713149 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c8799d87c-bmvq7" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon-log" containerID="cri-o://ac2a9c93717e421777017830530b83cdf2746386cbc77a87f3dca5a3bb336fe3" gracePeriod=30 Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.713191 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c8799d87c-bmvq7" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon" 
containerID="cri-o://4d7d0419a2c99ac2a41586562f3358a7995484e3ef8a269e0186734637b27a6f" gracePeriod=30 Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.715487 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.716194 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7332de29-571b-4dfd-8049-6b3c749109cc","Type":"ContainerStarted","Data":"cda81ffd195e942f85960a4fc9161011a00310b3c44ab20f9bf0c974f569c93a"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.716219 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7332de29-571b-4dfd-8049-6b3c749109cc","Type":"ContainerStarted","Data":"fde09e820542440d9afc9e0e32e51d230116eb2658757125fc327f25be664423"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.722560 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c5c76c54c-hrm4p" event={"ID":"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d","Type":"ContainerStarted","Data":"3e62b80534ac510a3738e79e12c35b07168b4b3b9e63b673896bab0020dbb858"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.722607 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c5c76c54c-hrm4p" event={"ID":"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d","Type":"ContainerStarted","Data":"71a0fbdeec6f77261bf21ba6f891657c7fb8be77e92f18d5ac6dec49a549a54f"} Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.734290 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-658c7589bf-7sm9b" podStartSLOduration=2.73033252 podStartE2EDuration="10.734270504s" podCreationTimestamp="2025-10-13 14:37:02 +0000 UTC" firstStartedPulling="2025-10-13 14:37:03.518563797 +0000 UTC m=+5401.052114053" lastFinishedPulling="2025-10-13 14:37:11.522501781 +0000 UTC m=+5409.056052037" 
observedRunningTime="2025-10-13 14:37:12.726196276 +0000 UTC m=+5410.259746542" watchObservedRunningTime="2025-10-13 14:37:12.734270504 +0000 UTC m=+5410.267820770" Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.772246 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c5c76c54c-hrm4p" podStartSLOduration=2.238894431 podStartE2EDuration="9.772223882s" podCreationTimestamp="2025-10-13 14:37:03 +0000 UTC" firstStartedPulling="2025-10-13 14:37:04.017794301 +0000 UTC m=+5401.551344557" lastFinishedPulling="2025-10-13 14:37:11.551123752 +0000 UTC m=+5409.084674008" observedRunningTime="2025-10-13 14:37:12.761346776 +0000 UTC m=+5410.294897032" watchObservedRunningTime="2025-10-13 14:37:12.772223882 +0000 UTC m=+5410.305774138" Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.780417 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c8799d87c-bmvq7" podStartSLOduration=2.52294155 podStartE2EDuration="10.780396562s" podCreationTimestamp="2025-10-13 14:37:02 +0000 UTC" firstStartedPulling="2025-10-13 14:37:03.262558648 +0000 UTC m=+5400.796108904" lastFinishedPulling="2025-10-13 14:37:11.52001366 +0000 UTC m=+5409.053563916" observedRunningTime="2025-10-13 14:37:12.779641304 +0000 UTC m=+5410.313191570" watchObservedRunningTime="2025-10-13 14:37:12.780396562 +0000 UTC m=+5410.313946838" Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.931537 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:12 crc kubenswrapper[4797]: I1013 14:37:12.931592 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:13 crc kubenswrapper[4797]: I1013 14:37:13.486337 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:13 crc kubenswrapper[4797]: I1013 
14:37:13.487452 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:13 crc kubenswrapper[4797]: I1013 14:37:13.732694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82fccc24-58d1-4521-9a93-be26f53cb8c3","Type":"ContainerStarted","Data":"1afef0636d4e6824cf0910be0012380e0ba8377dcf53ddc8c56ca8d2dce23506"} Oct 13 14:37:13 crc kubenswrapper[4797]: I1013 14:37:13.734205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7332de29-571b-4dfd-8049-6b3c749109cc","Type":"ContainerStarted","Data":"84622cf33ab0a6f6b34762d95dd981f0c2628122391b0899344547eefede8574"} Oct 13 14:37:13 crc kubenswrapper[4797]: I1013 14:37:13.762075 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.762057551 podStartE2EDuration="6.762057551s" podCreationTimestamp="2025-10-13 14:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:37:13.756768851 +0000 UTC m=+5411.290319127" watchObservedRunningTime="2025-10-13 14:37:13.762057551 +0000 UTC m=+5411.295607807" Oct 13 14:37:13 crc kubenswrapper[4797]: I1013 14:37:13.784536 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.784517961 podStartE2EDuration="7.784517961s" podCreationTimestamp="2025-10-13 14:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:37:13.776983786 +0000 UTC m=+5411.310534062" watchObservedRunningTime="2025-10-13 14:37:13.784517961 +0000 UTC m=+5411.318068207" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.434125 4797 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.434500 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.454207 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.454344 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.478471 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.481192 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.492623 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.505006 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.782181 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.782520 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.782549 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:17 crc kubenswrapper[4797]: I1013 14:37:17.782561 4797 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 14:37:19 crc kubenswrapper[4797]: I1013 14:37:19.806594 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 14:37:20 crc kubenswrapper[4797]: I1013 14:37:20.363645 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:20 crc kubenswrapper[4797]: I1013 14:37:20.367828 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 14:37:20 crc kubenswrapper[4797]: I1013 14:37:20.814630 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 14:37:20 crc kubenswrapper[4797]: I1013 14:37:20.869924 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 14:37:21 crc kubenswrapper[4797]: I1013 14:37:21.146407 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 14:37:22 crc kubenswrapper[4797]: I1013 14:37:22.932463 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-658c7589bf-7sm9b" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Oct 13 14:37:23 crc kubenswrapper[4797]: I1013 14:37:23.486790 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c5c76c54c-hrm4p" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.95:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.95:8080: connect: connection refused" Oct 13 14:37:34 crc kubenswrapper[4797]: I1013 14:37:34.811886 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:35 crc kubenswrapper[4797]: I1013 14:37:35.324464 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:36 crc kubenswrapper[4797]: I1013 14:37:36.605937 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:37:37 crc kubenswrapper[4797]: I1013 14:37:37.068964 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:37:37 crc kubenswrapper[4797]: I1013 14:37:37.141919 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-658c7589bf-7sm9b"] Oct 13 14:37:37 crc kubenswrapper[4797]: I1013 14:37:37.142130 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-658c7589bf-7sm9b" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon-log" containerID="cri-o://447a1ed385b1a6540864ff6af160c4364c22bc2ef6072fe62de862b303fb9854" gracePeriod=30 Oct 13 14:37:37 crc kubenswrapper[4797]: I1013 14:37:37.142263 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-658c7589bf-7sm9b" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" containerID="cri-o://b4a2dd7f13a5d90da8f935842304898343aba422001a545e60199494f4fdd976" gracePeriod=30 Oct 13 14:37:41 crc kubenswrapper[4797]: I1013 14:37:41.010456 4797 generic.go:334] "Generic (PLEG): container finished" podID="97db726c-fe9c-4730-9f04-004a744500f2" containerID="b4a2dd7f13a5d90da8f935842304898343aba422001a545e60199494f4fdd976" exitCode=0 Oct 13 14:37:41 crc kubenswrapper[4797]: I1013 14:37:41.010525 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658c7589bf-7sm9b" 
event={"ID":"97db726c-fe9c-4730-9f04-004a744500f2","Type":"ContainerDied","Data":"b4a2dd7f13a5d90da8f935842304898343aba422001a545e60199494f4fdd976"} Oct 13 14:37:42 crc kubenswrapper[4797]: I1013 14:37:42.931916 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-658c7589bf-7sm9b" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.029732 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerID="4d7d0419a2c99ac2a41586562f3358a7995484e3ef8a269e0186734637b27a6f" exitCode=137 Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.029765 4797 generic.go:334] "Generic (PLEG): container finished" podID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerID="ac2a9c93717e421777017830530b83cdf2746386cbc77a87f3dca5a3bb336fe3" exitCode=137 Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.029788 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8799d87c-bmvq7" event={"ID":"d9308db4-47d9-45e2-a059-a2aeed2ff9fe","Type":"ContainerDied","Data":"4d7d0419a2c99ac2a41586562f3358a7995484e3ef8a269e0186734637b27a6f"} Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.029830 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8799d87c-bmvq7" event={"ID":"d9308db4-47d9-45e2-a059-a2aeed2ff9fe","Type":"ContainerDied","Data":"ac2a9c93717e421777017830530b83cdf2746386cbc77a87f3dca5a3bb336fe3"} Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.130750 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.203135 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-logs\") pod \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.203256 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-scripts\") pod \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.203284 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68zvj\" (UniqueName: \"kubernetes.io/projected/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-kube-api-access-68zvj\") pod \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.203383 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-config-data\") pod \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.203411 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-horizon-secret-key\") pod \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\" (UID: \"d9308db4-47d9-45e2-a059-a2aeed2ff9fe\") " Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.203796 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-logs" (OuterVolumeSpecName: "logs") pod "d9308db4-47d9-45e2-a059-a2aeed2ff9fe" (UID: "d9308db4-47d9-45e2-a059-a2aeed2ff9fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.209048 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-kube-api-access-68zvj" (OuterVolumeSpecName: "kube-api-access-68zvj") pod "d9308db4-47d9-45e2-a059-a2aeed2ff9fe" (UID: "d9308db4-47d9-45e2-a059-a2aeed2ff9fe"). InnerVolumeSpecName "kube-api-access-68zvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.209076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d9308db4-47d9-45e2-a059-a2aeed2ff9fe" (UID: "d9308db4-47d9-45e2-a059-a2aeed2ff9fe"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.232012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-scripts" (OuterVolumeSpecName: "scripts") pod "d9308db4-47d9-45e2-a059-a2aeed2ff9fe" (UID: "d9308db4-47d9-45e2-a059-a2aeed2ff9fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.232972 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-config-data" (OuterVolumeSpecName: "config-data") pod "d9308db4-47d9-45e2-a059-a2aeed2ff9fe" (UID: "d9308db4-47d9-45e2-a059-a2aeed2ff9fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.306004 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68zvj\" (UniqueName: \"kubernetes.io/projected/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-kube-api-access-68zvj\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.306041 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.306054 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.306065 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:43 crc kubenswrapper[4797]: I1013 14:37:43.306076 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9308db4-47d9-45e2-a059-a2aeed2ff9fe-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:37:44 crc kubenswrapper[4797]: I1013 14:37:44.042128 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8799d87c-bmvq7" event={"ID":"d9308db4-47d9-45e2-a059-a2aeed2ff9fe","Type":"ContainerDied","Data":"49f339eacb97bfa0b317cdd994a775b94ff67d1c5e2a8e3cd704ffa95c2aa5e6"} Oct 13 14:37:44 crc kubenswrapper[4797]: I1013 14:37:44.042500 4797 scope.go:117] "RemoveContainer" containerID="4d7d0419a2c99ac2a41586562f3358a7995484e3ef8a269e0186734637b27a6f" Oct 13 14:37:44 crc kubenswrapper[4797]: I1013 14:37:44.042711 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c8799d87c-bmvq7" Oct 13 14:37:44 crc kubenswrapper[4797]: I1013 14:37:44.073570 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c8799d87c-bmvq7"] Oct 13 14:37:44 crc kubenswrapper[4797]: I1013 14:37:44.085938 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c8799d87c-bmvq7"] Oct 13 14:37:44 crc kubenswrapper[4797]: I1013 14:37:44.237728 4797 scope.go:117] "RemoveContainer" containerID="ac2a9c93717e421777017830530b83cdf2746386cbc77a87f3dca5a3bb336fe3" Oct 13 14:37:45 crc kubenswrapper[4797]: I1013 14:37:45.259674 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" path="/var/lib/kubelet/pods/d9308db4-47d9-45e2-a059-a2aeed2ff9fe/volumes" Oct 13 14:37:48 crc kubenswrapper[4797]: I1013 14:37:48.120238 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:37:48 crc kubenswrapper[4797]: I1013 14:37:48.121160 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:37:51 crc kubenswrapper[4797]: I1013 14:37:51.049641 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-996sj"] Oct 13 14:37:51 crc kubenswrapper[4797]: I1013 14:37:51.058533 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-996sj"] Oct 13 14:37:51 crc kubenswrapper[4797]: I1013 14:37:51.247180 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6fce202b-7f4c-4827-b0b0-056784afbb08" path="/var/lib/kubelet/pods/6fce202b-7f4c-4827-b0b0-056784afbb08/volumes" Oct 13 14:37:52 crc kubenswrapper[4797]: I1013 14:37:52.931678 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-658c7589bf-7sm9b" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.128261 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kq4h6"] Oct 13 14:37:58 crc kubenswrapper[4797]: E1013 14:37:58.129926 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon-log" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.129946 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon-log" Oct 13 14:37:58 crc kubenswrapper[4797]: E1013 14:37:58.129979 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.129987 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.130199 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon-log" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.130229 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9308db4-47d9-45e2-a059-a2aeed2ff9fe" containerName="horizon" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.133106 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.159435 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq4h6"] Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.213675 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-catalog-content\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.213999 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-utilities\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.214191 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdzq\" (UniqueName: \"kubernetes.io/projected/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-kube-api-access-dvdzq\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.315887 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdzq\" (UniqueName: \"kubernetes.io/projected/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-kube-api-access-dvdzq\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.316306 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-catalog-content\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.316999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-utilities\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.316921 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-catalog-content\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.317358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-utilities\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.346785 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdzq\" (UniqueName: \"kubernetes.io/projected/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-kube-api-access-dvdzq\") pod \"redhat-marketplace-kq4h6\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.464959 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:37:58 crc kubenswrapper[4797]: I1013 14:37:58.895392 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq4h6"] Oct 13 14:37:59 crc kubenswrapper[4797]: I1013 14:37:59.188391 4797 generic.go:334] "Generic (PLEG): container finished" podID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerID="d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d" exitCode=0 Oct 13 14:37:59 crc kubenswrapper[4797]: I1013 14:37:59.188475 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq4h6" event={"ID":"e03928a6-b0eb-4d80-8d9b-bbe571c7db10","Type":"ContainerDied","Data":"d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d"} Oct 13 14:37:59 crc kubenswrapper[4797]: I1013 14:37:59.188527 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq4h6" event={"ID":"e03928a6-b0eb-4d80-8d9b-bbe571c7db10","Type":"ContainerStarted","Data":"b823cc513d60ff511b09a8a71669a088c1ba875e65e2c7633f05065ce40b009b"} Oct 13 14:38:01 crc kubenswrapper[4797]: I1013 14:38:01.044126 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-247e-account-create-2hpqr"] Oct 13 14:38:01 crc kubenswrapper[4797]: I1013 14:38:01.056248 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-247e-account-create-2hpqr"] Oct 13 14:38:01 crc kubenswrapper[4797]: I1013 14:38:01.209500 4797 generic.go:334] "Generic (PLEG): container finished" podID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerID="2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28" exitCode=0 Oct 13 14:38:01 crc kubenswrapper[4797]: I1013 14:38:01.209540 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq4h6" 
event={"ID":"e03928a6-b0eb-4d80-8d9b-bbe571c7db10","Type":"ContainerDied","Data":"2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28"} Oct 13 14:38:01 crc kubenswrapper[4797]: I1013 14:38:01.247040 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90fae4c4-9315-431a-8b5a-ae7ca884d321" path="/var/lib/kubelet/pods/90fae4c4-9315-431a-8b5a-ae7ca884d321/volumes" Oct 13 14:38:02 crc kubenswrapper[4797]: I1013 14:38:02.220038 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq4h6" event={"ID":"e03928a6-b0eb-4d80-8d9b-bbe571c7db10","Type":"ContainerStarted","Data":"e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002"} Oct 13 14:38:02 crc kubenswrapper[4797]: I1013 14:38:02.246490 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kq4h6" podStartSLOduration=1.5469883580000001 podStartE2EDuration="4.246469541s" podCreationTimestamp="2025-10-13 14:37:58 +0000 UTC" firstStartedPulling="2025-10-13 14:37:59.190607391 +0000 UTC m=+5456.724157647" lastFinishedPulling="2025-10-13 14:38:01.890088574 +0000 UTC m=+5459.423638830" observedRunningTime="2025-10-13 14:38:02.236716972 +0000 UTC m=+5459.770267238" watchObservedRunningTime="2025-10-13 14:38:02.246469541 +0000 UTC m=+5459.780019797" Oct 13 14:38:02 crc kubenswrapper[4797]: I1013 14:38:02.931893 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-658c7589bf-7sm9b" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Oct 13 14:38:02 crc kubenswrapper[4797]: I1013 14:38:02.932292 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.314278 4797 
generic.go:334] "Generic (PLEG): container finished" podID="97db726c-fe9c-4730-9f04-004a744500f2" containerID="447a1ed385b1a6540864ff6af160c4364c22bc2ef6072fe62de862b303fb9854" exitCode=137 Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.314436 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658c7589bf-7sm9b" event={"ID":"97db726c-fe9c-4730-9f04-004a744500f2","Type":"ContainerDied","Data":"447a1ed385b1a6540864ff6af160c4364c22bc2ef6072fe62de862b303fb9854"} Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.544198 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.594636 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5hq\" (UniqueName: \"kubernetes.io/projected/97db726c-fe9c-4730-9f04-004a744500f2-kube-api-access-mc5hq\") pod \"97db726c-fe9c-4730-9f04-004a744500f2\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.594950 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-config-data\") pod \"97db726c-fe9c-4730-9f04-004a744500f2\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.595157 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-scripts\") pod \"97db726c-fe9c-4730-9f04-004a744500f2\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.595333 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/97db726c-fe9c-4730-9f04-004a744500f2-horizon-secret-key\") pod \"97db726c-fe9c-4730-9f04-004a744500f2\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.595526 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97db726c-fe9c-4730-9f04-004a744500f2-logs\") pod \"97db726c-fe9c-4730-9f04-004a744500f2\" (UID: \"97db726c-fe9c-4730-9f04-004a744500f2\") " Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.596101 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97db726c-fe9c-4730-9f04-004a744500f2-logs" (OuterVolumeSpecName: "logs") pod "97db726c-fe9c-4730-9f04-004a744500f2" (UID: "97db726c-fe9c-4730-9f04-004a744500f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.601074 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97db726c-fe9c-4730-9f04-004a744500f2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "97db726c-fe9c-4730-9f04-004a744500f2" (UID: "97db726c-fe9c-4730-9f04-004a744500f2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.605416 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97db726c-fe9c-4730-9f04-004a744500f2-kube-api-access-mc5hq" (OuterVolumeSpecName: "kube-api-access-mc5hq") pod "97db726c-fe9c-4730-9f04-004a744500f2" (UID: "97db726c-fe9c-4730-9f04-004a744500f2"). InnerVolumeSpecName "kube-api-access-mc5hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.621728 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-scripts" (OuterVolumeSpecName: "scripts") pod "97db726c-fe9c-4730-9f04-004a744500f2" (UID: "97db726c-fe9c-4730-9f04-004a744500f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.623272 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-config-data" (OuterVolumeSpecName: "config-data") pod "97db726c-fe9c-4730-9f04-004a744500f2" (UID: "97db726c-fe9c-4730-9f04-004a744500f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.698724 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.699338 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97db726c-fe9c-4730-9f04-004a744500f2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.699356 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97db726c-fe9c-4730-9f04-004a744500f2-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:07 crc kubenswrapper[4797]: I1013 14:38:07.699367 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5hq\" (UniqueName: \"kubernetes.io/projected/97db726c-fe9c-4730-9f04-004a744500f2-kube-api-access-mc5hq\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:07 crc 
kubenswrapper[4797]: I1013 14:38:07.699380 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97db726c-fe9c-4730-9f04-004a744500f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.327151 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658c7589bf-7sm9b" event={"ID":"97db726c-fe9c-4730-9f04-004a744500f2","Type":"ContainerDied","Data":"4e18acf03551ac80dcf6fd5515cc44f8ac454953b329875436d01e1e518420f2"} Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.327218 4797 scope.go:117] "RemoveContainer" containerID="b4a2dd7f13a5d90da8f935842304898343aba422001a545e60199494f4fdd976" Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.327217 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-658c7589bf-7sm9b" Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.373305 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-658c7589bf-7sm9b"] Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.382900 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-658c7589bf-7sm9b"] Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.465541 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.465604 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.490358 4797 scope.go:117] "RemoveContainer" containerID="447a1ed385b1a6540864ff6af160c4364c22bc2ef6072fe62de862b303fb9854" Oct 13 14:38:08 crc kubenswrapper[4797]: I1013 14:38:08.514740 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:38:09 
crc kubenswrapper[4797]: I1013 14:38:09.247553 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97db726c-fe9c-4730-9f04-004a744500f2" path="/var/lib/kubelet/pods/97db726c-fe9c-4730-9f04-004a744500f2/volumes" Oct 13 14:38:09 crc kubenswrapper[4797]: I1013 14:38:09.391394 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:38:09 crc kubenswrapper[4797]: I1013 14:38:09.441764 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq4h6"] Oct 13 14:38:10 crc kubenswrapper[4797]: I1013 14:38:10.028050 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dfjfv"] Oct 13 14:38:10 crc kubenswrapper[4797]: I1013 14:38:10.038959 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dfjfv"] Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.252879 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff293db-b437-4018-b341-f285a753a07e" path="/var/lib/kubelet/pods/aff293db-b437-4018-b341-f285a753a07e/volumes" Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.356178 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kq4h6" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="registry-server" containerID="cri-o://e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002" gracePeriod=2 Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.840834 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.904865 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-catalog-content\") pod \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.905501 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-utilities\") pod \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.905844 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvdzq\" (UniqueName: \"kubernetes.io/projected/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-kube-api-access-dvdzq\") pod \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\" (UID: \"e03928a6-b0eb-4d80-8d9b-bbe571c7db10\") " Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.906622 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-utilities" (OuterVolumeSpecName: "utilities") pod "e03928a6-b0eb-4d80-8d9b-bbe571c7db10" (UID: "e03928a6-b0eb-4d80-8d9b-bbe571c7db10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.907946 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.927869 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e03928a6-b0eb-4d80-8d9b-bbe571c7db10" (UID: "e03928a6-b0eb-4d80-8d9b-bbe571c7db10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:38:11 crc kubenswrapper[4797]: I1013 14:38:11.928895 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-kube-api-access-dvdzq" (OuterVolumeSpecName: "kube-api-access-dvdzq") pod "e03928a6-b0eb-4d80-8d9b-bbe571c7db10" (UID: "e03928a6-b0eb-4d80-8d9b-bbe571c7db10"). InnerVolumeSpecName "kube-api-access-dvdzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.009223 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvdzq\" (UniqueName: \"kubernetes.io/projected/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-kube-api-access-dvdzq\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.009263 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03928a6-b0eb-4d80-8d9b-bbe571c7db10-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.367310 4797 generic.go:334] "Generic (PLEG): container finished" podID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerID="e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002" exitCode=0 Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.367367 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq4h6" event={"ID":"e03928a6-b0eb-4d80-8d9b-bbe571c7db10","Type":"ContainerDied","Data":"e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002"} Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.367399 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq4h6" event={"ID":"e03928a6-b0eb-4d80-8d9b-bbe571c7db10","Type":"ContainerDied","Data":"b823cc513d60ff511b09a8a71669a088c1ba875e65e2c7633f05065ce40b009b"} Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.367419 4797 scope.go:117] "RemoveContainer" containerID="e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.367457 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq4h6" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.386217 4797 scope.go:117] "RemoveContainer" containerID="2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.400634 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq4h6"] Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.411799 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq4h6"] Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.415330 4797 scope.go:117] "RemoveContainer" containerID="d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.451147 4797 scope.go:117] "RemoveContainer" containerID="e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002" Oct 13 14:38:12 crc kubenswrapper[4797]: E1013 14:38:12.452138 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002\": container with ID starting with e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002 not found: ID does not exist" containerID="e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.452194 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002"} err="failed to get container status \"e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002\": rpc error: code = NotFound desc = could not find container \"e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002\": container with ID starting with e9624f5cd8cdfc1b4b628c0405c9203c3658ceed201b96e8081b34b09f607002 not found: 
ID does not exist" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.452229 4797 scope.go:117] "RemoveContainer" containerID="2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28" Oct 13 14:38:12 crc kubenswrapper[4797]: E1013 14:38:12.452955 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28\": container with ID starting with 2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28 not found: ID does not exist" containerID="2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.452982 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28"} err="failed to get container status \"2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28\": rpc error: code = NotFound desc = could not find container \"2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28\": container with ID starting with 2bbcd18e7bf4c2097f00c8ec9336d1a5c5fe6fac2d12c0620bd7bdf804cdba28 not found: ID does not exist" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.452997 4797 scope.go:117] "RemoveContainer" containerID="d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d" Oct 13 14:38:12 crc kubenswrapper[4797]: E1013 14:38:12.453435 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d\": container with ID starting with d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d not found: ID does not exist" containerID="d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d" Oct 13 14:38:12 crc kubenswrapper[4797]: I1013 14:38:12.453496 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d"} err="failed to get container status \"d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d\": rpc error: code = NotFound desc = could not find container \"d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d\": container with ID starting with d2febace3d7b3b095e28e9db847e676cbc7ea2679b978f076cd71e3b6c91e27d not found: ID does not exist" Oct 13 14:38:13 crc kubenswrapper[4797]: I1013 14:38:13.247204 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" path="/var/lib/kubelet/pods/e03928a6-b0eb-4d80-8d9b-bbe571c7db10/volumes" Oct 13 14:38:15 crc kubenswrapper[4797]: I1013 14:38:15.310652 4797 scope.go:117] "RemoveContainer" containerID="e70d2fe9da59d5e516f20659af2b9b3a164386e9cf890c92f826cfcab4349a2b" Oct 13 14:38:15 crc kubenswrapper[4797]: I1013 14:38:15.338690 4797 scope.go:117] "RemoveContainer" containerID="756427a03cf4d7437144651f7c7d03c21208f3496e265607cb767be930829401" Oct 13 14:38:15 crc kubenswrapper[4797]: I1013 14:38:15.390762 4797 scope.go:117] "RemoveContainer" containerID="424eec589bc6b1db174bd15459b71a639e20ecabdb6a3078229196de7f1432ba" Oct 13 14:38:18 crc kubenswrapper[4797]: I1013 14:38:18.120129 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:38:18 crc kubenswrapper[4797]: I1013 14:38:18.120494 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.557281 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55ccf7bdd9-dbjvm"] Oct 13 14:38:19 crc kubenswrapper[4797]: E1013 14:38:19.557881 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="registry-server" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.557893 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="registry-server" Oct 13 14:38:19 crc kubenswrapper[4797]: E1013 14:38:19.557907 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="extract-content" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.557913 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="extract-content" Oct 13 14:38:19 crc kubenswrapper[4797]: E1013 14:38:19.557921 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon-log" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.557927 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon-log" Oct 13 14:38:19 crc kubenswrapper[4797]: E1013 14:38:19.557958 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="extract-utilities" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.557966 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="extract-utilities" Oct 13 14:38:19 crc kubenswrapper[4797]: E1013 14:38:19.557994 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" Oct 13 14:38:19 
crc kubenswrapper[4797]: I1013 14:38:19.558000 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.558169 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon-log" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.558187 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03928a6-b0eb-4d80-8d9b-bbe571c7db10" containerName="registry-server" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.558200 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="97db726c-fe9c-4730-9f04-004a744500f2" containerName="horizon" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.559178 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.585154 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55ccf7bdd9-dbjvm"] Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.649407 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-logs\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.649506 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-horizon-secret-key\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.649570 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-scripts\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.649611 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-config-data\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.649654 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4cn4\" (UniqueName: \"kubernetes.io/projected/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-kube-api-access-f4cn4\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.751799 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-horizon-secret-key\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.751927 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-scripts\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.751978 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-config-data\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.752032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4cn4\" (UniqueName: \"kubernetes.io/projected/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-kube-api-access-f4cn4\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.752105 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-logs\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.752662 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-logs\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.752953 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-scripts\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm" Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.753695 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-config-data\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: 
\"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.757694 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-horizon-secret-key\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.768049 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4cn4\" (UniqueName: \"kubernetes.io/projected/59c6a50b-c86b-4e7e-98c8-067c2aaf9777-kube-api-access-f4cn4\") pod \"horizon-55ccf7bdd9-dbjvm\" (UID: \"59c6a50b-c86b-4e7e-98c8-067c2aaf9777\") " pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:19 crc kubenswrapper[4797]: I1013 14:38:19.891681 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.353210 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55ccf7bdd9-dbjvm"]
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.457367 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55ccf7bdd9-dbjvm" event={"ID":"59c6a50b-c86b-4e7e-98c8-067c2aaf9777","Type":"ContainerStarted","Data":"f1f6803b03fa558168f679e4bf908bb070c61a290806309ef16b01c21f78d6d0"}
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.684769 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-q9x6q"]
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.686397 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.695700 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-q9x6q"]
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.773083 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmkrb\" (UniqueName: \"kubernetes.io/projected/fe03d3d1-d5e5-4ab1-bbce-148648453626-kube-api-access-hmkrb\") pod \"heat-db-create-q9x6q\" (UID: \"fe03d3d1-d5e5-4ab1-bbce-148648453626\") " pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.874588 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmkrb\" (UniqueName: \"kubernetes.io/projected/fe03d3d1-d5e5-4ab1-bbce-148648453626-kube-api-access-hmkrb\") pod \"heat-db-create-q9x6q\" (UID: \"fe03d3d1-d5e5-4ab1-bbce-148648453626\") " pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:20 crc kubenswrapper[4797]: I1013 14:38:20.894374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmkrb\" (UniqueName: \"kubernetes.io/projected/fe03d3d1-d5e5-4ab1-bbce-148648453626-kube-api-access-hmkrb\") pod \"heat-db-create-q9x6q\" (UID: \"fe03d3d1-d5e5-4ab1-bbce-148648453626\") " pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:21 crc kubenswrapper[4797]: I1013 14:38:21.022274 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:21 crc kubenswrapper[4797]: I1013 14:38:21.466485 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-q9x6q"]
Oct 13 14:38:21 crc kubenswrapper[4797]: W1013 14:38:21.473736 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe03d3d1_d5e5_4ab1_bbce_148648453626.slice/crio-4b8a1920de8c63196a5868eee0324031decb28bc8aa2e6376ca689d879e96589 WatchSource:0}: Error finding container 4b8a1920de8c63196a5868eee0324031decb28bc8aa2e6376ca689d879e96589: Status 404 returned error can't find the container with id 4b8a1920de8c63196a5868eee0324031decb28bc8aa2e6376ca689d879e96589
Oct 13 14:38:21 crc kubenswrapper[4797]: I1013 14:38:21.480159 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55ccf7bdd9-dbjvm" event={"ID":"59c6a50b-c86b-4e7e-98c8-067c2aaf9777","Type":"ContainerStarted","Data":"526378263f91e9739862031a8c2303d0e0aff4a1d8497b6e0c3f9a152273dc3e"}
Oct 13 14:38:21 crc kubenswrapper[4797]: I1013 14:38:21.480248 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55ccf7bdd9-dbjvm" event={"ID":"59c6a50b-c86b-4e7e-98c8-067c2aaf9777","Type":"ContainerStarted","Data":"221a3a6ce88ad5417c3a2fcd60a231eabbf50fcb5d5fe60945e05bd3999725cc"}
Oct 13 14:38:21 crc kubenswrapper[4797]: I1013 14:38:21.506290 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55ccf7bdd9-dbjvm" podStartSLOduration=2.506265549 podStartE2EDuration="2.506265549s" podCreationTimestamp="2025-10-13 14:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:38:21.499055252 +0000 UTC m=+5479.032605528" watchObservedRunningTime="2025-10-13 14:38:21.506265549 +0000 UTC m=+5479.039815805"
Oct 13 14:38:22 crc kubenswrapper[4797]: I1013 14:38:22.512391 4797 generic.go:334] "Generic (PLEG): container finished" podID="fe03d3d1-d5e5-4ab1-bbce-148648453626" containerID="abcfb420a7f3dc8f893e44ec3a8a7b2c5085cd3df2159637112dd9f4c8eb2c9f" exitCode=0
Oct 13 14:38:22 crc kubenswrapper[4797]: I1013 14:38:22.512513 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q9x6q" event={"ID":"fe03d3d1-d5e5-4ab1-bbce-148648453626","Type":"ContainerDied","Data":"abcfb420a7f3dc8f893e44ec3a8a7b2c5085cd3df2159637112dd9f4c8eb2c9f"}
Oct 13 14:38:22 crc kubenswrapper[4797]: I1013 14:38:22.513042 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q9x6q" event={"ID":"fe03d3d1-d5e5-4ab1-bbce-148648453626","Type":"ContainerStarted","Data":"4b8a1920de8c63196a5868eee0324031decb28bc8aa2e6376ca689d879e96589"}
Oct 13 14:38:23 crc kubenswrapper[4797]: I1013 14:38:23.892646 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:23 crc kubenswrapper[4797]: I1013 14:38:23.945218 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmkrb\" (UniqueName: \"kubernetes.io/projected/fe03d3d1-d5e5-4ab1-bbce-148648453626-kube-api-access-hmkrb\") pod \"fe03d3d1-d5e5-4ab1-bbce-148648453626\" (UID: \"fe03d3d1-d5e5-4ab1-bbce-148648453626\") "
Oct 13 14:38:23 crc kubenswrapper[4797]: I1013 14:38:23.950561 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe03d3d1-d5e5-4ab1-bbce-148648453626-kube-api-access-hmkrb" (OuterVolumeSpecName: "kube-api-access-hmkrb") pod "fe03d3d1-d5e5-4ab1-bbce-148648453626" (UID: "fe03d3d1-d5e5-4ab1-bbce-148648453626"). InnerVolumeSpecName "kube-api-access-hmkrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:38:24 crc kubenswrapper[4797]: I1013 14:38:24.047499 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmkrb\" (UniqueName: \"kubernetes.io/projected/fe03d3d1-d5e5-4ab1-bbce-148648453626-kube-api-access-hmkrb\") on node \"crc\" DevicePath \"\""
Oct 13 14:38:24 crc kubenswrapper[4797]: I1013 14:38:24.534564 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q9x6q" event={"ID":"fe03d3d1-d5e5-4ab1-bbce-148648453626","Type":"ContainerDied","Data":"4b8a1920de8c63196a5868eee0324031decb28bc8aa2e6376ca689d879e96589"}
Oct 13 14:38:24 crc kubenswrapper[4797]: I1013 14:38:24.534602 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b8a1920de8c63196a5868eee0324031decb28bc8aa2e6376ca689d879e96589"
Oct 13 14:38:24 crc kubenswrapper[4797]: I1013 14:38:24.534656 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q9x6q"
Oct 13 14:38:29 crc kubenswrapper[4797]: I1013 14:38:29.893089 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:29 crc kubenswrapper[4797]: I1013 14:38:29.893508 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.789552 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-2f95-account-create-xvpl7"]
Oct 13 14:38:30 crc kubenswrapper[4797]: E1013 14:38:30.790246 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe03d3d1-d5e5-4ab1-bbce-148648453626" containerName="mariadb-database-create"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.790278 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe03d3d1-d5e5-4ab1-bbce-148648453626" containerName="mariadb-database-create"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.790549 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe03d3d1-d5e5-4ab1-bbce-148648453626" containerName="mariadb-database-create"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.791694 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.793783 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.802676 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2f95-account-create-xvpl7"]
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.887836 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpzhg\" (UniqueName: \"kubernetes.io/projected/a153a9b5-e82d-467b-bbd0-6904bbd9a75b-kube-api-access-lpzhg\") pod \"heat-2f95-account-create-xvpl7\" (UID: \"a153a9b5-e82d-467b-bbd0-6904bbd9a75b\") " pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:30 crc kubenswrapper[4797]: I1013 14:38:30.989697 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpzhg\" (UniqueName: \"kubernetes.io/projected/a153a9b5-e82d-467b-bbd0-6904bbd9a75b-kube-api-access-lpzhg\") pod \"heat-2f95-account-create-xvpl7\" (UID: \"a153a9b5-e82d-467b-bbd0-6904bbd9a75b\") " pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:31 crc kubenswrapper[4797]: I1013 14:38:31.010608 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpzhg\" (UniqueName: \"kubernetes.io/projected/a153a9b5-e82d-467b-bbd0-6904bbd9a75b-kube-api-access-lpzhg\") pod \"heat-2f95-account-create-xvpl7\" (UID: \"a153a9b5-e82d-467b-bbd0-6904bbd9a75b\") " pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:31 crc kubenswrapper[4797]: I1013 14:38:31.124579 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:31 crc kubenswrapper[4797]: I1013 14:38:31.589104 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2f95-account-create-xvpl7"]
Oct 13 14:38:31 crc kubenswrapper[4797]: I1013 14:38:31.609547 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2f95-account-create-xvpl7" event={"ID":"a153a9b5-e82d-467b-bbd0-6904bbd9a75b","Type":"ContainerStarted","Data":"0522dc24f4eced7dac31ca9ac960b3ee09a3f9a8d227ef0948483c41fa296e9f"}
Oct 13 14:38:32 crc kubenswrapper[4797]: I1013 14:38:32.619311 4797 generic.go:334] "Generic (PLEG): container finished" podID="a153a9b5-e82d-467b-bbd0-6904bbd9a75b" containerID="a599d6d74584c4a41379ec371da024c5f86e25efa6e5f21aa56ee11b6bc3ac95" exitCode=0
Oct 13 14:38:32 crc kubenswrapper[4797]: I1013 14:38:32.619624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2f95-account-create-xvpl7" event={"ID":"a153a9b5-e82d-467b-bbd0-6904bbd9a75b","Type":"ContainerDied","Data":"a599d6d74584c4a41379ec371da024c5f86e25efa6e5f21aa56ee11b6bc3ac95"}
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.068255 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.155302 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpzhg\" (UniqueName: \"kubernetes.io/projected/a153a9b5-e82d-467b-bbd0-6904bbd9a75b-kube-api-access-lpzhg\") pod \"a153a9b5-e82d-467b-bbd0-6904bbd9a75b\" (UID: \"a153a9b5-e82d-467b-bbd0-6904bbd9a75b\") "
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.161565 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a153a9b5-e82d-467b-bbd0-6904bbd9a75b-kube-api-access-lpzhg" (OuterVolumeSpecName: "kube-api-access-lpzhg") pod "a153a9b5-e82d-467b-bbd0-6904bbd9a75b" (UID: "a153a9b5-e82d-467b-bbd0-6904bbd9a75b"). InnerVolumeSpecName "kube-api-access-lpzhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.260122 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpzhg\" (UniqueName: \"kubernetes.io/projected/a153a9b5-e82d-467b-bbd0-6904bbd9a75b-kube-api-access-lpzhg\") on node \"crc\" DevicePath \"\""
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.643852 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2f95-account-create-xvpl7" event={"ID":"a153a9b5-e82d-467b-bbd0-6904bbd9a75b","Type":"ContainerDied","Data":"0522dc24f4eced7dac31ca9ac960b3ee09a3f9a8d227ef0948483c41fa296e9f"}
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.643898 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0522dc24f4eced7dac31ca9ac960b3ee09a3f9a8d227ef0948483c41fa296e9f"
Oct 13 14:38:34 crc kubenswrapper[4797]: I1013 14:38:34.643946 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2f95-account-create-xvpl7"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.854735 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-dlf5b"]
Oct 13 14:38:35 crc kubenswrapper[4797]: E1013 14:38:35.855566 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a153a9b5-e82d-467b-bbd0-6904bbd9a75b" containerName="mariadb-account-create"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.855586 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a153a9b5-e82d-467b-bbd0-6904bbd9a75b" containerName="mariadb-account-create"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.855886 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a153a9b5-e82d-467b-bbd0-6904bbd9a75b" containerName="mariadb-account-create"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.856719 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.859489 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.859559 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-hgtpf"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.865927 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-dlf5b"]
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.895970 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-config-data\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.896162 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mvl\" (UniqueName: \"kubernetes.io/projected/ae765d31-580c-49c1-be2f-aca757f6e464-kube-api-access-t2mvl\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.896282 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-combined-ca-bundle\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.997474 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-config-data\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.997559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mvl\" (UniqueName: \"kubernetes.io/projected/ae765d31-580c-49c1-be2f-aca757f6e464-kube-api-access-t2mvl\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:35 crc kubenswrapper[4797]: I1013 14:38:35.997640 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-combined-ca-bundle\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:36 crc kubenswrapper[4797]: I1013 14:38:36.004261 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-combined-ca-bundle\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:36 crc kubenswrapper[4797]: I1013 14:38:36.004330 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-config-data\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:36 crc kubenswrapper[4797]: I1013 14:38:36.021583 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mvl\" (UniqueName: \"kubernetes.io/projected/ae765d31-580c-49c1-be2f-aca757f6e464-kube-api-access-t2mvl\") pod \"heat-db-sync-dlf5b\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") " pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:36 crc kubenswrapper[4797]: I1013 14:38:36.176094 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:36 crc kubenswrapper[4797]: I1013 14:38:36.646299 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-dlf5b"]
Oct 13 14:38:36 crc kubenswrapper[4797]: I1013 14:38:36.663790 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dlf5b" event={"ID":"ae765d31-580c-49c1-be2f-aca757f6e464","Type":"ContainerStarted","Data":"84392e9614a4f9243fd9815d3dd6a324997de2efed6242c40fc1717a877db150"}
Oct 13 14:38:39 crc kubenswrapper[4797]: I1013 14:38:39.040208 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lbdbr"]
Oct 13 14:38:39 crc kubenswrapper[4797]: I1013 14:38:39.048834 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lbdbr"]
Oct 13 14:38:39 crc kubenswrapper[4797]: I1013 14:38:39.247727 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c0b6e0-17cc-43c2-b42b-706129ced2e3" path="/var/lib/kubelet/pods/00c0b6e0-17cc-43c2-b42b-706129ced2e3/volumes"
Oct 13 14:38:41 crc kubenswrapper[4797]: I1013 14:38:41.943012 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:43 crc kubenswrapper[4797]: I1013 14:38:43.801572 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-55ccf7bdd9-dbjvm"
Oct 13 14:38:43 crc kubenswrapper[4797]: I1013 14:38:43.907469 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c5c76c54c-hrm4p"]
Oct 13 14:38:43 crc kubenswrapper[4797]: I1013 14:38:43.907680 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c5c76c54c-hrm4p" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon-log" containerID="cri-o://71a0fbdeec6f77261bf21ba6f891657c7fb8be77e92f18d5ac6dec49a549a54f" gracePeriod=30
Oct 13 14:38:43 crc kubenswrapper[4797]: I1013 14:38:43.908116 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c5c76c54c-hrm4p" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" containerID="cri-o://3e62b80534ac510a3738e79e12c35b07168b4b3b9e63b673896bab0020dbb858" gracePeriod=30
Oct 13 14:38:46 crc kubenswrapper[4797]: I1013 14:38:46.755587 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dlf5b" event={"ID":"ae765d31-580c-49c1-be2f-aca757f6e464","Type":"ContainerStarted","Data":"9f64fc111deb6154d387cfa1edb9779b1ba05a14e8e687b7f6af0c606a85b201"}
Oct 13 14:38:46 crc kubenswrapper[4797]: I1013 14:38:46.779193 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-dlf5b" podStartSLOduration=2.905603671 podStartE2EDuration="11.779177951s" podCreationTimestamp="2025-10-13 14:38:35 +0000 UTC" firstStartedPulling="2025-10-13 14:38:36.651543524 +0000 UTC m=+5494.185093780" lastFinishedPulling="2025-10-13 14:38:45.525117804 +0000 UTC m=+5503.058668060" observedRunningTime="2025-10-13 14:38:46.770604491 +0000 UTC m=+5504.304154787" watchObservedRunningTime="2025-10-13 14:38:46.779177951 +0000 UTC m=+5504.312728207"
Oct 13 14:38:47 crc kubenswrapper[4797]: I1013 14:38:47.766904 4797 generic.go:334] "Generic (PLEG): container finished" podID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerID="3e62b80534ac510a3738e79e12c35b07168b4b3b9e63b673896bab0020dbb858" exitCode=0
Oct 13 14:38:47 crc kubenswrapper[4797]: I1013 14:38:47.766981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c5c76c54c-hrm4p" event={"ID":"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d","Type":"ContainerDied","Data":"3e62b80534ac510a3738e79e12c35b07168b4b3b9e63b673896bab0020dbb858"}
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.120451 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.120574 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.120661 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs"
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.121700 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6beaa7adf1d21db8fdfdf908e4a91ef09e840de8f57f89fa2a6f4402ae41c29"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.121830 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://f6beaa7adf1d21db8fdfdf908e4a91ef09e840de8f57f89fa2a6f4402ae41c29" gracePeriod=600
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.778462 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="f6beaa7adf1d21db8fdfdf908e4a91ef09e840de8f57f89fa2a6f4402ae41c29" exitCode=0
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.778541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"f6beaa7adf1d21db8fdfdf908e4a91ef09e840de8f57f89fa2a6f4402ae41c29"}
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.778847 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00"}
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.778873 4797 scope.go:117] "RemoveContainer" containerID="96c8267bd4c8e99eeab0f52fde47a06d5529395a03b2ed9e13ec45aa355e370b"
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.783833 4797 generic.go:334] "Generic (PLEG): container finished" podID="ae765d31-580c-49c1-be2f-aca757f6e464" containerID="9f64fc111deb6154d387cfa1edb9779b1ba05a14e8e687b7f6af0c606a85b201" exitCode=0
Oct 13 14:38:48 crc kubenswrapper[4797]: I1013 14:38:48.783880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dlf5b" event={"ID":"ae765d31-580c-49c1-be2f-aca757f6e464","Type":"ContainerDied","Data":"9f64fc111deb6154d387cfa1edb9779b1ba05a14e8e687b7f6af0c606a85b201"}
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.030718 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cf68-account-create-xv5hv"]
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.042463 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cf68-account-create-xv5hv"]
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.136528 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.217309 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2mvl\" (UniqueName: \"kubernetes.io/projected/ae765d31-580c-49c1-be2f-aca757f6e464-kube-api-access-t2mvl\") pod \"ae765d31-580c-49c1-be2f-aca757f6e464\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") "
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.217502 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-config-data\") pod \"ae765d31-580c-49c1-be2f-aca757f6e464\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") "
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.217553 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-combined-ca-bundle\") pod \"ae765d31-580c-49c1-be2f-aca757f6e464\" (UID: \"ae765d31-580c-49c1-be2f-aca757f6e464\") "
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.224617 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae765d31-580c-49c1-be2f-aca757f6e464-kube-api-access-t2mvl" (OuterVolumeSpecName: "kube-api-access-t2mvl") pod "ae765d31-580c-49c1-be2f-aca757f6e464" (UID: "ae765d31-580c-49c1-be2f-aca757f6e464"). InnerVolumeSpecName "kube-api-access-t2mvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.254905 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae765d31-580c-49c1-be2f-aca757f6e464" (UID: "ae765d31-580c-49c1-be2f-aca757f6e464"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.294409 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-config-data" (OuterVolumeSpecName: "config-data") pod "ae765d31-580c-49c1-be2f-aca757f6e464" (UID: "ae765d31-580c-49c1-be2f-aca757f6e464"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.320098 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2mvl\" (UniqueName: \"kubernetes.io/projected/ae765d31-580c-49c1-be2f-aca757f6e464-kube-api-access-t2mvl\") on node \"crc\" DevicePath \"\""
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.320135 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.320146 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae765d31-580c-49c1-be2f-aca757f6e464-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.808369 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dlf5b" event={"ID":"ae765d31-580c-49c1-be2f-aca757f6e464","Type":"ContainerDied","Data":"84392e9614a4f9243fd9815d3dd6a324997de2efed6242c40fc1717a877db150"}
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.808418 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84392e9614a4f9243fd9815d3dd6a324997de2efed6242c40fc1717a877db150"
Oct 13 14:38:50 crc kubenswrapper[4797]: I1013 14:38:50.808525 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dlf5b"
Oct 13 14:38:51 crc kubenswrapper[4797]: I1013 14:38:51.246529 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76b9c6a-0f53-48f0-9f69-13094dde56ca" path="/var/lib/kubelet/pods/b76b9c6a-0f53-48f0-9f69-13094dde56ca/volumes"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.190684 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6c67d84d9-t7s9t"]
Oct 13 14:38:52 crc kubenswrapper[4797]: E1013 14:38:52.191422 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae765d31-580c-49c1-be2f-aca757f6e464" containerName="heat-db-sync"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.191443 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae765d31-580c-49c1-be2f-aca757f6e464" containerName="heat-db-sync"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.191652 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae765d31-580c-49c1-be2f-aca757f6e464" containerName="heat-db-sync"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.192373 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.194022 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-hgtpf"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.194370 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.194486 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.206938 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c67d84d9-t7s9t"]
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.334865 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6794f4b959-8wbgr"]
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.336519 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6794f4b959-8wbgr"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.339441 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.352543 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6b8568fd4f-wdjdv"]
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.359098 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g255q\" (UniqueName: \"kubernetes.io/projected/456c5f14-d211-4869-914d-73d2fd6efa69-kube-api-access-g255q\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.359153 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-combined-ca-bundle\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.359252 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-config-data\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.359352 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-config-data-custom\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.360292 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.361906 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6794f4b959-8wbgr"]
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.368342 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.409916 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b8568fd4f-wdjdv"]
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.460884 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-config-data\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.462977 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-combined-ca-bundle\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463003 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-config-data\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463041 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g255q\" (UniqueName: \"kubernetes.io/projected/456c5f14-d211-4869-914d-73d2fd6efa69-kube-api-access-g255q\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463070 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-combined-ca-bundle\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-config-data-custom\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463167 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-combined-ca-bundle\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463237 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgv2z\" (UniqueName: \"kubernetes.io/projected/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-kube-api-access-zgv2z\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463280 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-config-data-custom\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463306 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-config-data\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463398 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-config-data-custom\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.463447 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfdbz\" (UniqueName: \"kubernetes.io/projected/19924c78-0046-4b6a-91be-546357d8b190-kube-api-access-dfdbz\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv"
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.472974 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-config-data-custom\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t"
Oct 13 14:38:52 crc 
kubenswrapper[4797]: I1013 14:38:52.473130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-combined-ca-bundle\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.491832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456c5f14-d211-4869-914d-73d2fd6efa69-config-data\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.494740 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g255q\" (UniqueName: \"kubernetes.io/projected/456c5f14-d211-4869-914d-73d2fd6efa69-kube-api-access-g255q\") pod \"heat-engine-6c67d84d9-t7s9t\" (UID: \"456c5f14-d211-4869-914d-73d2fd6efa69\") " pod="openstack/heat-engine-6c67d84d9-t7s9t" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.517470 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6c67d84d9-t7s9t" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567492 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-config-data\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567588 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-combined-ca-bundle\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567619 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-config-data\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567696 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-config-data-custom\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567772 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-combined-ca-bundle\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" 
Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567862 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgv2z\" (UniqueName: \"kubernetes.io/projected/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-kube-api-access-zgv2z\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.567906 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-config-data-custom\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.568043 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfdbz\" (UniqueName: \"kubernetes.io/projected/19924c78-0046-4b6a-91be-546357d8b190-kube-api-access-dfdbz\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.572408 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-config-data\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.572481 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-combined-ca-bundle\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 
14:38:52.577145 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-config-data\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.585607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-combined-ca-bundle\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.585747 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19924c78-0046-4b6a-91be-546357d8b190-config-data-custom\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.586638 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfdbz\" (UniqueName: \"kubernetes.io/projected/19924c78-0046-4b6a-91be-546357d8b190-kube-api-access-dfdbz\") pod \"heat-cfnapi-6b8568fd4f-wdjdv\" (UID: \"19924c78-0046-4b6a-91be-546357d8b190\") " pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.588832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-config-data-custom\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.597446 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zgv2z\" (UniqueName: \"kubernetes.io/projected/b291a2fb-8b0f-4ece-8665-fa382a5c51e4-kube-api-access-zgv2z\") pod \"heat-api-6794f4b959-8wbgr\" (UID: \"b291a2fb-8b0f-4ece-8665-fa382a5c51e4\") " pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.655919 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:52 crc kubenswrapper[4797]: I1013 14:38:52.688513 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:53 crc kubenswrapper[4797]: W1013 14:38:53.067252 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod456c5f14_d211_4869_914d_73d2fd6efa69.slice/crio-52dc1f7fab0bcb774f530cfe2d2f53e0d22bf95ac951acf3482974331220bc35 WatchSource:0}: Error finding container 52dc1f7fab0bcb774f530cfe2d2f53e0d22bf95ac951acf3482974331220bc35: Status 404 returned error can't find the container with id 52dc1f7fab0bcb774f530cfe2d2f53e0d22bf95ac951acf3482974331220bc35 Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.071878 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c67d84d9-t7s9t"] Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.252308 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6794f4b959-8wbgr"] Oct 13 14:38:53 crc kubenswrapper[4797]: W1013 14:38:53.348263 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19924c78_0046_4b6a_91be_546357d8b190.slice/crio-200a32a07623e8cf82642c145c2dc62816328fa8aca797a302f7307d3134089f WatchSource:0}: Error finding container 200a32a07623e8cf82642c145c2dc62816328fa8aca797a302f7307d3134089f: Status 404 returned error can't find the container with id 
200a32a07623e8cf82642c145c2dc62816328fa8aca797a302f7307d3134089f Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.358596 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b8568fd4f-wdjdv"] Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.485688 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c5c76c54c-hrm4p" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.95:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.95:8080: connect: connection refused" Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.873091 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" event={"ID":"19924c78-0046-4b6a-91be-546357d8b190","Type":"ContainerStarted","Data":"200a32a07623e8cf82642c145c2dc62816328fa8aca797a302f7307d3134089f"} Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.883746 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6794f4b959-8wbgr" event={"ID":"b291a2fb-8b0f-4ece-8665-fa382a5c51e4","Type":"ContainerStarted","Data":"52fa003af8b64af3f51d87be0756b0f4aa624660ba450e10687d4be73fd8ed15"} Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.886580 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c67d84d9-t7s9t" event={"ID":"456c5f14-d211-4869-914d-73d2fd6efa69","Type":"ContainerStarted","Data":"28b5365fe1d7390623cffaf0fd0554e2993f80c4ebce5fc8069768813971c342"} Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.886713 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c67d84d9-t7s9t" event={"ID":"456c5f14-d211-4869-914d-73d2fd6efa69","Type":"ContainerStarted","Data":"52dc1f7fab0bcb774f530cfe2d2f53e0d22bf95ac951acf3482974331220bc35"} Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.888010 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/heat-engine-6c67d84d9-t7s9t" Oct 13 14:38:53 crc kubenswrapper[4797]: I1013 14:38:53.917484 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6c67d84d9-t7s9t" podStartSLOduration=1.9174373770000002 podStartE2EDuration="1.917437377s" podCreationTimestamp="2025-10-13 14:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:38:53.904293825 +0000 UTC m=+5511.437844081" watchObservedRunningTime="2025-10-13 14:38:53.917437377 +0000 UTC m=+5511.450987633" Oct 13 14:38:55 crc kubenswrapper[4797]: I1013 14:38:55.910528 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" event={"ID":"19924c78-0046-4b6a-91be-546357d8b190","Type":"ContainerStarted","Data":"12ca18d1136281f0e98af3ff61db5ac4a09d59352f642ef178e1e7a22c61248a"} Oct 13 14:38:55 crc kubenswrapper[4797]: I1013 14:38:55.912980 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:38:55 crc kubenswrapper[4797]: I1013 14:38:55.913033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6794f4b959-8wbgr" event={"ID":"b291a2fb-8b0f-4ece-8665-fa382a5c51e4","Type":"ContainerStarted","Data":"a5dc79aa900005f26a2a2f3e7afb6e3860c875cce29480b16964bec545b90ba7"} Oct 13 14:38:55 crc kubenswrapper[4797]: I1013 14:38:55.913325 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:38:55 crc kubenswrapper[4797]: I1013 14:38:55.965900 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" podStartSLOduration=2.4501828420000002 podStartE2EDuration="3.965884648s" podCreationTimestamp="2025-10-13 14:38:52 +0000 UTC" firstStartedPulling="2025-10-13 14:38:53.353394305 +0000 UTC m=+5510.886944561" 
lastFinishedPulling="2025-10-13 14:38:54.869096111 +0000 UTC m=+5512.402646367" observedRunningTime="2025-10-13 14:38:55.937453872 +0000 UTC m=+5513.471004148" watchObservedRunningTime="2025-10-13 14:38:55.965884648 +0000 UTC m=+5513.499434904" Oct 13 14:38:55 crc kubenswrapper[4797]: I1013 14:38:55.968213 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6794f4b959-8wbgr" podStartSLOduration=2.353809413 podStartE2EDuration="3.968206985s" podCreationTimestamp="2025-10-13 14:38:52 +0000 UTC" firstStartedPulling="2025-10-13 14:38:53.252678049 +0000 UTC m=+5510.786228305" lastFinishedPulling="2025-10-13 14:38:54.867075611 +0000 UTC m=+5512.400625877" observedRunningTime="2025-10-13 14:38:55.956503038 +0000 UTC m=+5513.490053314" watchObservedRunningTime="2025-10-13 14:38:55.968206985 +0000 UTC m=+5513.501757241" Oct 13 14:38:57 crc kubenswrapper[4797]: I1013 14:38:57.044515 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zhs42"] Oct 13 14:38:57 crc kubenswrapper[4797]: I1013 14:38:57.054714 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zhs42"] Oct 13 14:38:57 crc kubenswrapper[4797]: I1013 14:38:57.249860 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87" path="/var/lib/kubelet/pods/bc4c6e24-80f6-4ac8-b5d3-4f54dd1d7e87/volumes" Oct 13 14:39:03 crc kubenswrapper[4797]: I1013 14:39:03.486602 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c5c76c54c-hrm4p" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.95:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.95:8080: connect: connection refused" Oct 13 14:39:04 crc kubenswrapper[4797]: I1013 14:39:04.973079 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/heat-cfnapi-6b8568fd4f-wdjdv" Oct 13 14:39:04 crc kubenswrapper[4797]: I1013 14:39:04.975979 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6794f4b959-8wbgr" Oct 13 14:39:12 crc kubenswrapper[4797]: I1013 14:39:12.544945 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6c67d84d9-t7s9t" Oct 13 14:39:13 crc kubenswrapper[4797]: I1013 14:39:13.485725 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c5c76c54c-hrm4p" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.95:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.95:8080: connect: connection refused" Oct 13 14:39:13 crc kubenswrapper[4797]: I1013 14:39:13.485874 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.082257 4797 generic.go:334] "Generic (PLEG): container finished" podID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerID="71a0fbdeec6f77261bf21ba6f891657c7fb8be77e92f18d5ac6dec49a549a54f" exitCode=137 Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.082321 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c5c76c54c-hrm4p" event={"ID":"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d","Type":"ContainerDied","Data":"71a0fbdeec6f77261bf21ba6f891657c7fb8be77e92f18d5ac6dec49a549a54f"} Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.430358 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.614255 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-scripts\") pod \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.614513 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-logs\") pod \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.614557 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-config-data\") pod \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.614636 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-horizon-secret-key\") pod \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.614753 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmswn\" (UniqueName: \"kubernetes.io/projected/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-kube-api-access-pmswn\") pod \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\" (UID: \"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d\") " Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.614966 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-logs" (OuterVolumeSpecName: "logs") pod "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" (UID: "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.615470 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-logs\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.625885 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-kube-api-access-pmswn" (OuterVolumeSpecName: "kube-api-access-pmswn") pod "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" (UID: "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d"). InnerVolumeSpecName "kube-api-access-pmswn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.625883 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" (UID: "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.645637 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-scripts" (OuterVolumeSpecName: "scripts") pod "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" (UID: "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.646356 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-config-data" (OuterVolumeSpecName: "config-data") pod "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" (UID: "14c1d6c3-d0ec-48c2-bca2-391a04f0c47d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.717048 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.717096 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.717140 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:14 crc kubenswrapper[4797]: I1013 14:39:14.717158 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmswn\" (UniqueName: \"kubernetes.io/projected/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d-kube-api-access-pmswn\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.098043 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c5c76c54c-hrm4p" event={"ID":"14c1d6c3-d0ec-48c2-bca2-391a04f0c47d","Type":"ContainerDied","Data":"ad59835e11fb8bbbb708e936ea32341d862776e9a6ae08b77172f6eb52365504"} Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.098104 4797 scope.go:117] "RemoveContainer" 
containerID="3e62b80534ac510a3738e79e12c35b07168b4b3b9e63b673896bab0020dbb858" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.098102 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c5c76c54c-hrm4p" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.142874 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c5c76c54c-hrm4p"] Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.162899 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c5c76c54c-hrm4p"] Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.246871 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" path="/var/lib/kubelet/pods/14c1d6c3-d0ec-48c2-bca2-391a04f0c47d/volumes" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.337328 4797 scope.go:117] "RemoveContainer" containerID="71a0fbdeec6f77261bf21ba6f891657c7fb8be77e92f18d5ac6dec49a549a54f" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.512709 4797 scope.go:117] "RemoveContainer" containerID="41f4df9c2e5d1cf58537ea4c6f0e63a65af78328c7930e1c7058802628d7f7df" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.538930 4797 scope.go:117] "RemoveContainer" containerID="b27e61b23d28756876233b7d45b11cacfad8decbcb08eba0f3eaad2da91eaada" Oct 13 14:39:15 crc kubenswrapper[4797]: I1013 14:39:15.613319 4797 scope.go:117] "RemoveContainer" containerID="46b68267a8495306a9d6a730fd0bd31db302c64e244072d7aeed3466beb39955" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.958174 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j"] Oct 13 14:39:22 crc kubenswrapper[4797]: E1013 14:39:22.959315 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon-log" Oct 13 14:39:22 crc 
kubenswrapper[4797]: I1013 14:39:22.959335 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon-log" Oct 13 14:39:22 crc kubenswrapper[4797]: E1013 14:39:22.959348 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.959358 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.959623 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon-log" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.959648 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c1d6c3-d0ec-48c2-bca2-391a04f0c47d" containerName="horizon" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.961529 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.966449 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 14:39:22 crc kubenswrapper[4797]: I1013 14:39:22.967831 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j"] Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.093855 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/db995f55-9b3f-4fc7-a12c-9f37d196794c-kube-api-access-qcgz9\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.093954 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.094004 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: 
I1013 14:39:23.196294 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.196454 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.197040 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.197055 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.197505 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcgz9\" (UniqueName: 
\"kubernetes.io/projected/db995f55-9b3f-4fc7-a12c-9f37d196794c-kube-api-access-qcgz9\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.217751 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/db995f55-9b3f-4fc7-a12c-9f37d196794c-kube-api-access-qcgz9\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.314613 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:23 crc kubenswrapper[4797]: I1013 14:39:23.833768 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j"] Oct 13 14:39:24 crc kubenswrapper[4797]: I1013 14:39:24.197755 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" event={"ID":"db995f55-9b3f-4fc7-a12c-9f37d196794c","Type":"ContainerStarted","Data":"c11fe571d9a53f0d2a450a04eb913af6b69e11b224fad1abe54653eae580ac31"} Oct 13 14:39:24 crc kubenswrapper[4797]: I1013 14:39:24.197874 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" event={"ID":"db995f55-9b3f-4fc7-a12c-9f37d196794c","Type":"ContainerStarted","Data":"9142999a7276771482963eff1808362811979a2c06abf9f7a467d3e3d054b3a4"} Oct 13 14:39:24 crc kubenswrapper[4797]: E1013 
14:39:24.859417 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb995f55_9b3f_4fc7_a12c_9f37d196794c.slice/crio-conmon-c11fe571d9a53f0d2a450a04eb913af6b69e11b224fad1abe54653eae580ac31.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb995f55_9b3f_4fc7_a12c_9f37d196794c.slice/crio-c11fe571d9a53f0d2a450a04eb913af6b69e11b224fad1abe54653eae580ac31.scope\": RecentStats: unable to find data in memory cache]" Oct 13 14:39:25 crc kubenswrapper[4797]: I1013 14:39:25.208040 4797 generic.go:334] "Generic (PLEG): container finished" podID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerID="c11fe571d9a53f0d2a450a04eb913af6b69e11b224fad1abe54653eae580ac31" exitCode=0 Oct 13 14:39:25 crc kubenswrapper[4797]: I1013 14:39:25.208095 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" event={"ID":"db995f55-9b3f-4fc7-a12c-9f37d196794c","Type":"ContainerDied","Data":"c11fe571d9a53f0d2a450a04eb913af6b69e11b224fad1abe54653eae580ac31"} Oct 13 14:39:28 crc kubenswrapper[4797]: I1013 14:39:28.236564 4797 generic.go:334] "Generic (PLEG): container finished" podID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerID="b7380659f4b874d2143f617f7a0dfee7d99d2088095c2c92aab30a216f7c399e" exitCode=0 Oct 13 14:39:28 crc kubenswrapper[4797]: I1013 14:39:28.236669 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" event={"ID":"db995f55-9b3f-4fc7-a12c-9f37d196794c","Type":"ContainerDied","Data":"b7380659f4b874d2143f617f7a0dfee7d99d2088095c2c92aab30a216f7c399e"} Oct 13 14:39:29 crc kubenswrapper[4797]: I1013 14:39:29.247450 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerID="4417829a5c300a9b314c5b67b3cdbb50680e6680f758e8efb96df8ccc30dc0b7" exitCode=0 Oct 13 14:39:29 crc kubenswrapper[4797]: I1013 14:39:29.247502 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" event={"ID":"db995f55-9b3f-4fc7-a12c-9f37d196794c","Type":"ContainerDied","Data":"4417829a5c300a9b314c5b67b3cdbb50680e6680f758e8efb96df8ccc30dc0b7"} Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.628602 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.745990 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-bundle\") pod \"db995f55-9b3f-4fc7-a12c-9f37d196794c\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.746112 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-util\") pod \"db995f55-9b3f-4fc7-a12c-9f37d196794c\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.746152 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/db995f55-9b3f-4fc7-a12c-9f37d196794c-kube-api-access-qcgz9\") pod \"db995f55-9b3f-4fc7-a12c-9f37d196794c\" (UID: \"db995f55-9b3f-4fc7-a12c-9f37d196794c\") " Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.748991 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-bundle" 
(OuterVolumeSpecName: "bundle") pod "db995f55-9b3f-4fc7-a12c-9f37d196794c" (UID: "db995f55-9b3f-4fc7-a12c-9f37d196794c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.751379 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db995f55-9b3f-4fc7-a12c-9f37d196794c-kube-api-access-qcgz9" (OuterVolumeSpecName: "kube-api-access-qcgz9") pod "db995f55-9b3f-4fc7-a12c-9f37d196794c" (UID: "db995f55-9b3f-4fc7-a12c-9f37d196794c"). InnerVolumeSpecName "kube-api-access-qcgz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.758860 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-util" (OuterVolumeSpecName: "util") pod "db995f55-9b3f-4fc7-a12c-9f37d196794c" (UID: "db995f55-9b3f-4fc7-a12c-9f37d196794c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.848959 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.848999 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db995f55-9b3f-4fc7-a12c-9f37d196794c-util\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:30 crc kubenswrapper[4797]: I1013 14:39:30.849015 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/db995f55-9b3f-4fc7-a12c-9f37d196794c-kube-api-access-qcgz9\") on node \"crc\" DevicePath \"\"" Oct 13 14:39:31 crc kubenswrapper[4797]: I1013 14:39:31.274063 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" event={"ID":"db995f55-9b3f-4fc7-a12c-9f37d196794c","Type":"ContainerDied","Data":"9142999a7276771482963eff1808362811979a2c06abf9f7a467d3e3d054b3a4"} Oct 13 14:39:31 crc kubenswrapper[4797]: I1013 14:39:31.274382 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9142999a7276771482963eff1808362811979a2c06abf9f7a467d3e3d054b3a4" Oct 13 14:39:31 crc kubenswrapper[4797]: I1013 14:39:31.274134 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.608543 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ngck"] Oct 13 14:39:36 crc kubenswrapper[4797]: E1013 14:39:36.609543 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="extract" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.609560 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="extract" Oct 13 14:39:36 crc kubenswrapper[4797]: E1013 14:39:36.609605 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="pull" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.609613 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="pull" Oct 13 14:39:36 crc kubenswrapper[4797]: E1013 14:39:36.609636 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="util" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.609644 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="util" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.609875 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="db995f55-9b3f-4fc7-a12c-9f37d196794c" containerName="extract" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.611590 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.627322 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ngck"] Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.759450 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-utilities\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.759562 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7kx\" (UniqueName: \"kubernetes.io/projected/f46db46d-c53b-493b-94f1-2349893a731d-kube-api-access-jk7kx\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.759614 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-catalog-content\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.861133 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-utilities\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.861215 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jk7kx\" (UniqueName: \"kubernetes.io/projected/f46db46d-c53b-493b-94f1-2349893a731d-kube-api-access-jk7kx\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.861247 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-catalog-content\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.861669 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-utilities\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.861739 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-catalog-content\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.890338 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7kx\" (UniqueName: \"kubernetes.io/projected/f46db46d-c53b-493b-94f1-2349893a731d-kube-api-access-jk7kx\") pod \"community-operators-8ngck\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:36 crc kubenswrapper[4797]: I1013 14:39:36.934688 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:37 crc kubenswrapper[4797]: I1013 14:39:37.838406 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ngck"] Oct 13 14:39:38 crc kubenswrapper[4797]: I1013 14:39:38.332572 4797 generic.go:334] "Generic (PLEG): container finished" podID="f46db46d-c53b-493b-94f1-2349893a731d" containerID="e382c4c08f81a5c9e0df28192dca5c21ac8d368f39072d74ca848965c631ca38" exitCode=0 Oct 13 14:39:38 crc kubenswrapper[4797]: I1013 14:39:38.332616 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerDied","Data":"e382c4c08f81a5c9e0df28192dca5c21ac8d368f39072d74ca848965c631ca38"} Oct 13 14:39:38 crc kubenswrapper[4797]: I1013 14:39:38.332645 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerStarted","Data":"6a601448a8fdf1a196a883caa485671f1abf96e15ea56a22694db8e5cc8e62f5"} Oct 13 14:39:40 crc kubenswrapper[4797]: I1013 14:39:40.065301 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gw9tq"] Oct 13 14:39:40 crc kubenswrapper[4797]: I1013 14:39:40.078394 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gw9tq"] Oct 13 14:39:40 crc kubenswrapper[4797]: I1013 14:39:40.349124 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerStarted","Data":"441988c6edd3c991c6a9efef27a4cf97b7fd3cdd1dce4354b34ea9cb8ba9c732"} Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.246786 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669368b9-50a6-46f6-b526-26be88b3c854" 
path="/var/lib/kubelet/pods/669368b9-50a6-46f6-b526-26be88b3c854/volumes" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.354244 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.358641 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.367481 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-jlv2z" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.367570 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.367796 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.369618 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.469682 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfnd\" (UniqueName: \"kubernetes.io/projected/c51a5185-9bfb-46c9-95fa-b41b91150fc1-kube-api-access-8tfnd\") pod \"obo-prometheus-operator-7c8cf85677-tplx7\" (UID: \"c51a5185-9bfb-46c9-95fa-b41b91150fc1\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.528864 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.530040 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.536180 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.536274 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6gs46" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.563290 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.574670 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfnd\" (UniqueName: \"kubernetes.io/projected/c51a5185-9bfb-46c9-95fa-b41b91150fc1-kube-api-access-8tfnd\") pod \"obo-prometheus-operator-7c8cf85677-tplx7\" (UID: \"c51a5185-9bfb-46c9-95fa-b41b91150fc1\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.579652 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.580974 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.624241 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.625055 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfnd\" (UniqueName: \"kubernetes.io/projected/c51a5185-9bfb-46c9-95fa-b41b91150fc1-kube-api-access-8tfnd\") pod \"obo-prometheus-operator-7c8cf85677-tplx7\" (UID: \"c51a5185-9bfb-46c9-95fa-b41b91150fc1\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.676667 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57f56c14-5bb3-410a-a578-4814c6ce81a8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-4fbtq\" (UID: \"57f56c14-5bb3-410a-a578-4814c6ce81a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.676757 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d98b442-31b7-44c4-b551-55af7fb2ff25-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-w8c2d\" (UID: \"2d98b442-31b7-44c4-b551-55af7fb2ff25\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.676872 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d98b442-31b7-44c4-b551-55af7fb2ff25-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-84558866d4-w8c2d\" (UID: \"2d98b442-31b7-44c4-b551-55af7fb2ff25\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.677152 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57f56c14-5bb3-410a-a578-4814c6ce81a8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-4fbtq\" (UID: \"57f56c14-5bb3-410a-a578-4814c6ce81a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.714746 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-rpdw6"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.716683 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.719054 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.719255 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-ngw4w" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.742507 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.744766 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-rpdw6"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.779255 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57f56c14-5bb3-410a-a578-4814c6ce81a8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-4fbtq\" (UID: \"57f56c14-5bb3-410a-a578-4814c6ce81a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.779338 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4449e3ff-7bc9-44b4-b1e6-932bb69225dd-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-rpdw6\" (UID: \"4449e3ff-7bc9-44b4-b1e6-932bb69225dd\") " pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.779388 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d98b442-31b7-44c4-b551-55af7fb2ff25-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-w8c2d\" (UID: \"2d98b442-31b7-44c4-b551-55af7fb2ff25\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.779440 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d98b442-31b7-44c4-b551-55af7fb2ff25-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-w8c2d\" (UID: 
\"2d98b442-31b7-44c4-b551-55af7fb2ff25\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.779502 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jr7\" (UniqueName: \"kubernetes.io/projected/4449e3ff-7bc9-44b4-b1e6-932bb69225dd-kube-api-access-92jr7\") pod \"observability-operator-cc5f78dfc-rpdw6\" (UID: \"4449e3ff-7bc9-44b4-b1e6-932bb69225dd\") " pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.779563 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57f56c14-5bb3-410a-a578-4814c6ce81a8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-4fbtq\" (UID: \"57f56c14-5bb3-410a-a578-4814c6ce81a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.789836 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57f56c14-5bb3-410a-a578-4814c6ce81a8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-4fbtq\" (UID: \"57f56c14-5bb3-410a-a578-4814c6ce81a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.790233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d98b442-31b7-44c4-b551-55af7fb2ff25-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-w8c2d\" (UID: \"2d98b442-31b7-44c4-b551-55af7fb2ff25\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.793539 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d98b442-31b7-44c4-b551-55af7fb2ff25-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-w8c2d\" (UID: \"2d98b442-31b7-44c4-b551-55af7fb2ff25\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.797084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57f56c14-5bb3-410a-a578-4814c6ce81a8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84558866d4-4fbtq\" (UID: \"57f56c14-5bb3-410a-a578-4814c6ce81a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.850637 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.883015 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jr7\" (UniqueName: \"kubernetes.io/projected/4449e3ff-7bc9-44b4-b1e6-932bb69225dd-kube-api-access-92jr7\") pod \"observability-operator-cc5f78dfc-rpdw6\" (UID: \"4449e3ff-7bc9-44b4-b1e6-932bb69225dd\") " pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.885252 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4449e3ff-7bc9-44b4-b1e6-932bb69225dd-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-rpdw6\" (UID: \"4449e3ff-7bc9-44b4-b1e6-932bb69225dd\") " pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.893682 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4449e3ff-7bc9-44b4-b1e6-932bb69225dd-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-rpdw6\" (UID: \"4449e3ff-7bc9-44b4-b1e6-932bb69225dd\") " pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.897650 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.911792 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jr7\" (UniqueName: \"kubernetes.io/projected/4449e3ff-7bc9-44b4-b1e6-932bb69225dd-kube-api-access-92jr7\") pod \"observability-operator-cc5f78dfc-rpdw6\" (UID: \"4449e3ff-7bc9-44b4-b1e6-932bb69225dd\") " pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.979696 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-zc6fr"] Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.981654 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:41 crc kubenswrapper[4797]: I1013 14:39:41.990741 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-5n54q" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.001873 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-zc6fr"] Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.038518 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.097459 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9fl\" (UniqueName: \"kubernetes.io/projected/ece0b1a2-3e6f-461f-bace-aece27efc279-kube-api-access-pr9fl\") pod \"perses-operator-54bc95c9fb-zc6fr\" (UID: \"ece0b1a2-3e6f-461f-bace-aece27efc279\") " pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.097596 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ece0b1a2-3e6f-461f-bace-aece27efc279-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-zc6fr\" (UID: \"ece0b1a2-3e6f-461f-bace-aece27efc279\") " pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.203673 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ece0b1a2-3e6f-461f-bace-aece27efc279-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-zc6fr\" (UID: \"ece0b1a2-3e6f-461f-bace-aece27efc279\") " pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.203902 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9fl\" (UniqueName: \"kubernetes.io/projected/ece0b1a2-3e6f-461f-bace-aece27efc279-kube-api-access-pr9fl\") pod \"perses-operator-54bc95c9fb-zc6fr\" (UID: \"ece0b1a2-3e6f-461f-bace-aece27efc279\") " pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.205084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ece0b1a2-3e6f-461f-bace-aece27efc279-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-zc6fr\" (UID: \"ece0b1a2-3e6f-461f-bace-aece27efc279\") " pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.240512 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9fl\" (UniqueName: \"kubernetes.io/projected/ece0b1a2-3e6f-461f-bace-aece27efc279-kube-api-access-pr9fl\") pod \"perses-operator-54bc95c9fb-zc6fr\" (UID: \"ece0b1a2-3e6f-461f-bace-aece27efc279\") " pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.340536 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.410325 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7"] Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.411537 4797 generic.go:334] "Generic (PLEG): container finished" podID="f46db46d-c53b-493b-94f1-2349893a731d" containerID="441988c6edd3c991c6a9efef27a4cf97b7fd3cdd1dce4354b34ea9cb8ba9c732" exitCode=0 Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.411589 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerDied","Data":"441988c6edd3c991c6a9efef27a4cf97b7fd3cdd1dce4354b34ea9cb8ba9c732"} Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.518588 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-rpdw6"] Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.551932 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq"] 
Oct 13 14:39:42 crc kubenswrapper[4797]: I1013 14:39:42.580448 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d"] Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.041896 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-zc6fr"] Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.422554 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerStarted","Data":"2ea1684682fb471fc6d93999489dcb8847d2622ffdfee1b1248c1c966ba2f7d2"} Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.426362 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" event={"ID":"57f56c14-5bb3-410a-a578-4814c6ce81a8","Type":"ContainerStarted","Data":"737220b560a84fd3cf5dc078af207cb06a5390b7f6ae9a0411a3a25a41d805c5"} Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.427491 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" event={"ID":"2d98b442-31b7-44c4-b551-55af7fb2ff25","Type":"ContainerStarted","Data":"fd12bca20a50fb7dcdcf54fda57cf5168161d005b6d34d3b5f5527b978933194"} Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.430091 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" event={"ID":"4449e3ff-7bc9-44b4-b1e6-932bb69225dd","Type":"ContainerStarted","Data":"588701445f4ea6c69b3d8872e3d80be183b075290ebee7429bdefc29c9a268b5"} Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.431182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" 
event={"ID":"ece0b1a2-3e6f-461f-bace-aece27efc279","Type":"ContainerStarted","Data":"3decd599f817d63e56b2889100cc55fac32cf2452ac32e8bce430681111360cc"} Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.432399 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" event={"ID":"c51a5185-9bfb-46c9-95fa-b41b91150fc1","Type":"ContainerStarted","Data":"8ac25ffcffde21fa0afc39bee2968252490fc7fc7ca7b13ad93949452749c7e0"} Oct 13 14:39:43 crc kubenswrapper[4797]: I1013 14:39:43.450013 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ngck" podStartSLOduration=2.686351423 podStartE2EDuration="7.449987741s" podCreationTimestamp="2025-10-13 14:39:36 +0000 UTC" firstStartedPulling="2025-10-13 14:39:38.334544438 +0000 UTC m=+5555.868094704" lastFinishedPulling="2025-10-13 14:39:43.098180766 +0000 UTC m=+5560.631731022" observedRunningTime="2025-10-13 14:39:43.441968864 +0000 UTC m=+5560.975519130" watchObservedRunningTime="2025-10-13 14:39:43.449987741 +0000 UTC m=+5560.983538007" Oct 13 14:39:46 crc kubenswrapper[4797]: I1013 14:39:46.935838 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:46 crc kubenswrapper[4797]: I1013 14:39:46.936332 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:46 crc kubenswrapper[4797]: I1013 14:39:46.992021 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:50 crc kubenswrapper[4797]: I1013 14:39:50.043579 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8a64-account-create-5g7vx"] Oct 13 14:39:50 crc kubenswrapper[4797]: I1013 14:39:50.054800 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-8a64-account-create-5g7vx"] Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.249753 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e5c216-96da-4a27-b0e2-26973c459a03" path="/var/lib/kubelet/pods/99e5c216-96da-4a27-b0e2-26973c459a03/volumes" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.557137 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" event={"ID":"ece0b1a2-3e6f-461f-bace-aece27efc279","Type":"ContainerStarted","Data":"e96ac8d35ba14d6b1ab1254b50e8f71bb569dc0c258d4f3f0d5a89f0c188c5bb"} Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.557934 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.567647 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" event={"ID":"c51a5185-9bfb-46c9-95fa-b41b91150fc1","Type":"ContainerStarted","Data":"50c341a2c4c011979c130b0a52d06dd9bf78d87e76c44f6e1e9e9e039f287537"} Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.569843 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" event={"ID":"57f56c14-5bb3-410a-a578-4814c6ce81a8","Type":"ContainerStarted","Data":"76411263bda7c5d083d62ce09433ca8a37da70e9a66aabcede8aeec307566404"} Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.571616 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" event={"ID":"2d98b442-31b7-44c4-b551-55af7fb2ff25","Type":"ContainerStarted","Data":"17be761fa99fced4cac6110d3412519c2e1f3579c5d575a694aa5ccad42ec271"} Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.574177 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" event={"ID":"4449e3ff-7bc9-44b4-b1e6-932bb69225dd","Type":"ContainerStarted","Data":"3a6d3c4e47a7befe2d58365bcaca0f7560efac43320dafe14703ff3f06064a19"} Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.574339 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.582520 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.590417 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" podStartSLOduration=2.820071564 podStartE2EDuration="10.590394507s" podCreationTimestamp="2025-10-13 14:39:41 +0000 UTC" firstStartedPulling="2025-10-13 14:39:43.09467062 +0000 UTC m=+5560.628220876" lastFinishedPulling="2025-10-13 14:39:50.864993563 +0000 UTC m=+5568.398543819" observedRunningTime="2025-10-13 14:39:51.585111477 +0000 UTC m=+5569.118661733" watchObservedRunningTime="2025-10-13 14:39:51.590394507 +0000 UTC m=+5569.123944763" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.617154 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-rpdw6" podStartSLOduration=2.211994604 podStartE2EDuration="10.617132841s" podCreationTimestamp="2025-10-13 14:39:41 +0000 UTC" firstStartedPulling="2025-10-13 14:39:42.513020798 +0000 UTC m=+5560.046571064" lastFinishedPulling="2025-10-13 14:39:50.918159045 +0000 UTC m=+5568.451709301" observedRunningTime="2025-10-13 14:39:51.614593229 +0000 UTC m=+5569.148143505" watchObservedRunningTime="2025-10-13 14:39:51.617132841 +0000 UTC m=+5569.150683107" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.647131 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-tplx7" podStartSLOduration=2.229983625 podStartE2EDuration="10.647104735s" podCreationTimestamp="2025-10-13 14:39:41 +0000 UTC" firstStartedPulling="2025-10-13 14:39:42.445989577 +0000 UTC m=+5559.979539843" lastFinishedPulling="2025-10-13 14:39:50.863110697 +0000 UTC m=+5568.396660953" observedRunningTime="2025-10-13 14:39:51.64196893 +0000 UTC m=+5569.175519206" watchObservedRunningTime="2025-10-13 14:39:51.647104735 +0000 UTC m=+5569.180655001" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.684616 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-w8c2d" podStartSLOduration=2.35519669 podStartE2EDuration="10.684590503s" podCreationTimestamp="2025-10-13 14:39:41 +0000 UTC" firstStartedPulling="2025-10-13 14:39:42.532755681 +0000 UTC m=+5560.066305937" lastFinishedPulling="2025-10-13 14:39:50.862149494 +0000 UTC m=+5568.395699750" observedRunningTime="2025-10-13 14:39:51.681151229 +0000 UTC m=+5569.214701505" watchObservedRunningTime="2025-10-13 14:39:51.684590503 +0000 UTC m=+5569.218140779" Oct 13 14:39:51 crc kubenswrapper[4797]: I1013 14:39:51.713002 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84558866d4-4fbtq" podStartSLOduration=2.339987358 podStartE2EDuration="10.712977128s" podCreationTimestamp="2025-10-13 14:39:41 +0000 UTC" firstStartedPulling="2025-10-13 14:39:42.547853691 +0000 UTC m=+5560.081403947" lastFinishedPulling="2025-10-13 14:39:50.920843461 +0000 UTC m=+5568.454393717" observedRunningTime="2025-10-13 14:39:51.709895623 +0000 UTC m=+5569.243445889" watchObservedRunningTime="2025-10-13 14:39:51.712977128 +0000 UTC m=+5569.246527384" Oct 13 14:39:56 crc kubenswrapper[4797]: I1013 14:39:56.999026 4797 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.252080 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ngck"] Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.252617 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ngck" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="registry-server" containerID="cri-o://2ea1684682fb471fc6d93999489dcb8847d2622ffdfee1b1248c1c966ba2f7d2" gracePeriod=2 Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.658141 4797 generic.go:334] "Generic (PLEG): container finished" podID="f46db46d-c53b-493b-94f1-2349893a731d" containerID="2ea1684682fb471fc6d93999489dcb8847d2622ffdfee1b1248c1c966ba2f7d2" exitCode=0 Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.658224 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerDied","Data":"2ea1684682fb471fc6d93999489dcb8847d2622ffdfee1b1248c1c966ba2f7d2"} Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.829074 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.988855 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-catalog-content\") pod \"f46db46d-c53b-493b-94f1-2349893a731d\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.988928 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk7kx\" (UniqueName: \"kubernetes.io/projected/f46db46d-c53b-493b-94f1-2349893a731d-kube-api-access-jk7kx\") pod \"f46db46d-c53b-493b-94f1-2349893a731d\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.989124 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-utilities\") pod \"f46db46d-c53b-493b-94f1-2349893a731d\" (UID: \"f46db46d-c53b-493b-94f1-2349893a731d\") " Oct 13 14:39:59 crc kubenswrapper[4797]: I1013 14:39:59.990100 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-utilities" (OuterVolumeSpecName: "utilities") pod "f46db46d-c53b-493b-94f1-2349893a731d" (UID: "f46db46d-c53b-493b-94f1-2349893a731d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.018487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46db46d-c53b-493b-94f1-2349893a731d-kube-api-access-jk7kx" (OuterVolumeSpecName: "kube-api-access-jk7kx") pod "f46db46d-c53b-493b-94f1-2349893a731d" (UID: "f46db46d-c53b-493b-94f1-2349893a731d"). InnerVolumeSpecName "kube-api-access-jk7kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.070161 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f46db46d-c53b-493b-94f1-2349893a731d" (UID: "f46db46d-c53b-493b-94f1-2349893a731d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.091245 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.091284 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk7kx\" (UniqueName: \"kubernetes.io/projected/f46db46d-c53b-493b-94f1-2349893a731d-kube-api-access-jk7kx\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.091296 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46db46d-c53b-493b-94f1-2349893a731d-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.670885 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngck" event={"ID":"f46db46d-c53b-493b-94f1-2349893a731d","Type":"ContainerDied","Data":"6a601448a8fdf1a196a883caa485671f1abf96e15ea56a22694db8e5cc8e62f5"} Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.670936 4797 scope.go:117] "RemoveContainer" containerID="2ea1684682fb471fc6d93999489dcb8847d2622ffdfee1b1248c1c966ba2f7d2" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.671003 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ngck" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.704782 4797 scope.go:117] "RemoveContainer" containerID="441988c6edd3c991c6a9efef27a4cf97b7fd3cdd1dce4354b34ea9cb8ba9c732" Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.718689 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ngck"] Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.728660 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ngck"] Oct 13 14:40:00 crc kubenswrapper[4797]: I1013 14:40:00.746735 4797 scope.go:117] "RemoveContainer" containerID="e382c4c08f81a5c9e0df28192dca5c21ac8d368f39072d74ca848965c631ca38" Oct 13 14:40:01 crc kubenswrapper[4797]: I1013 14:40:01.247566 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46db46d-c53b-493b-94f1-2349893a731d" path="/var/lib/kubelet/pods/f46db46d-c53b-493b-94f1-2349893a731d/volumes" Oct 13 14:40:02 crc kubenswrapper[4797]: I1013 14:40:02.344317 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-zc6fr" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:04.957570 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:04.958373 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" containerName="openstackclient" containerID="cri-o://039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c" gracePeriod=2 Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:04.973795 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.025671 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstackclient"] Oct 13 14:40:05 crc kubenswrapper[4797]: E1013 14:40:05.026102 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" containerName="openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.026118 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" containerName="openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: E1013 14:40:05.026133 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="extract-utilities" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.026139 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="extract-utilities" Oct 13 14:40:05 crc kubenswrapper[4797]: E1013 14:40:05.026155 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="extract-content" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.026160 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="extract-content" Oct 13 14:40:05 crc kubenswrapper[4797]: E1013 14:40:05.026168 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="registry-server" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.026174 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="registry-server" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.026359 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46db46d-c53b-493b-94f1-2349893a731d" containerName="registry-server" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.026370 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" containerName="openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.027037 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.041056 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.086412 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" podUID="7b6a4ded-dd7b-45bc-94c3-379d1288d07f" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.178956 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-openstack-config-secret\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.178999 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-openstack-config\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.179025 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zsm\" (UniqueName: \"kubernetes.io/projected/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-kube-api-access-t9zsm\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.281323 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-openstack-config-secret\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.281680 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-openstack-config\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.281721 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zsm\" (UniqueName: \"kubernetes.io/projected/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-kube-api-access-t9zsm\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.284078 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-openstack-config\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.307171 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-openstack-config-secret\") pod \"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.330928 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zsm\" (UniqueName: \"kubernetes.io/projected/7b6a4ded-dd7b-45bc-94c3-379d1288d07f-kube-api-access-t9zsm\") pod 
\"openstackclient\" (UID: \"7b6a4ded-dd7b-45bc-94c3-379d1288d07f\") " pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.364107 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.385462 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.386711 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.397235 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2td8l" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.405639 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.495050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48wlh\" (UniqueName: \"kubernetes.io/projected/637619fb-02b4-4561-9765-0d75ef8ab480-kube-api-access-48wlh\") pod \"kube-state-metrics-0\" (UID: \"637619fb-02b4-4561-9765-0d75ef8ab480\") " pod="openstack/kube-state-metrics-0" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.597465 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48wlh\" (UniqueName: \"kubernetes.io/projected/637619fb-02b4-4561-9765-0d75ef8ab480-kube-api-access-48wlh\") pod \"kube-state-metrics-0\" (UID: \"637619fb-02b4-4561-9765-0d75ef8ab480\") " pod="openstack/kube-state-metrics-0" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.640583 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48wlh\" (UniqueName: \"kubernetes.io/projected/637619fb-02b4-4561-9765-0d75ef8ab480-kube-api-access-48wlh\") pod 
\"kube-state-metrics-0\" (UID: \"637619fb-02b4-4561-9765-0d75ef8ab480\") " pod="openstack/kube-state-metrics-0" Oct 13 14:40:05 crc kubenswrapper[4797]: I1013 14:40:05.834554 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.221138 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.232791 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.240056 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.240092 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.240125 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-2rtjr" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.240059 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.248725 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.321914 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc 
kubenswrapper[4797]: I1013 14:40:06.342043 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.342090 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.342235 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.345497 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.345571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzt6\" (UniqueName: \"kubernetes.io/projected/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-kube-api-access-jvzt6\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " 
pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.448301 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.448603 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.448725 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.448892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.449111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " 
pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.449228 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzt6\" (UniqueName: \"kubernetes.io/projected/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-kube-api-access-jvzt6\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.451476 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.472269 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.473536 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.473994 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 
14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.493582 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.511513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzt6\" (UniqueName: \"kubernetes.io/projected/8431ed84-8990-4a23-9d29-0f7ef6dfc9d0-kube-api-access-jvzt6\") pod \"alertmanager-metric-storage-0\" (UID: \"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0\") " pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.552592 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.679718 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.740053 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.747737 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.761641 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.762042 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-qzzqx" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.762184 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.762301 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.762436 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.762606 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.787210 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.838330 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7b6a4ded-dd7b-45bc-94c3-379d1288d07f","Type":"ContainerStarted","Data":"d30b0df69f12b554835c89b591c62ce61c73d6db509c5d2afcea222470411994"} Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889001 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " 
pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889129 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-656ec596-7adc-4ec1-bf80-48991910abd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656ec596-7adc-4ec1-bf80-48991910abd1\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889222 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-config\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889252 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889320 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889352 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889376 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.889400 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69cqv\" (UniqueName: \"kubernetes.io/projected/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-kube-api-access-69cqv\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.998313 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-656ec596-7adc-4ec1-bf80-48991910abd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656ec596-7adc-4ec1-bf80-48991910abd1\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.998799 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-config\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.998885 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.998974 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.999012 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.999034 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.999062 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69cqv\" (UniqueName: \"kubernetes.io/projected/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-kube-api-access-69cqv\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:06 crc kubenswrapper[4797]: I1013 14:40:06.999105 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.008213 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.013506 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.018048 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.019166 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.019208 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-656ec596-7adc-4ec1-bf80-48991910abd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656ec596-7adc-4ec1-bf80-48991910abd1\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/015baed9d556b766924d2c38096f20dbfc14b15f0082bfadfb277e2efd24391f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.019576 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.032350 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.033855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-config\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.036332 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.046047 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69cqv\" (UniqueName: \"kubernetes.io/projected/5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d-kube-api-access-69cqv\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.201398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-656ec596-7adc-4ec1-bf80-48991910abd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656ec596-7adc-4ec1-bf80-48991910abd1\") pod \"prometheus-metric-storage-0\" (UID: \"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d\") " pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.378516 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.442749 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.580461 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.719439 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77gcs\" (UniqueName: \"kubernetes.io/projected/468f31d5-3d01-48fa-92ed-82c6cafc7690-kube-api-access-77gcs\") pod \"468f31d5-3d01-48fa-92ed-82c6cafc7690\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.719825 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config-secret\") pod \"468f31d5-3d01-48fa-92ed-82c6cafc7690\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.719860 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config\") pod \"468f31d5-3d01-48fa-92ed-82c6cafc7690\" (UID: \"468f31d5-3d01-48fa-92ed-82c6cafc7690\") " Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.728731 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468f31d5-3d01-48fa-92ed-82c6cafc7690-kube-api-access-77gcs" (OuterVolumeSpecName: "kube-api-access-77gcs") pod "468f31d5-3d01-48fa-92ed-82c6cafc7690" (UID: "468f31d5-3d01-48fa-92ed-82c6cafc7690"). InnerVolumeSpecName "kube-api-access-77gcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.763750 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "468f31d5-3d01-48fa-92ed-82c6cafc7690" (UID: "468f31d5-3d01-48fa-92ed-82c6cafc7690"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.785519 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "468f31d5-3d01-48fa-92ed-82c6cafc7690" (UID: "468f31d5-3d01-48fa-92ed-82c6cafc7690"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.822261 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.822295 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77gcs\" (UniqueName: \"kubernetes.io/projected/468f31d5-3d01-48fa-92ed-82c6cafc7690-kube-api-access-77gcs\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.822305 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/468f31d5-3d01-48fa-92ed-82c6cafc7690-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.852040 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"637619fb-02b4-4561-9765-0d75ef8ab480","Type":"ContainerStarted","Data":"bebc4b4445a44c9d89c0b258ce2d90c035cd74b7c0fb33ccd8c5cf5f427f453c"} Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.853858 4797 generic.go:334] "Generic (PLEG): container finished" podID="468f31d5-3d01-48fa-92ed-82c6cafc7690" containerID="039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c" exitCode=137 Oct 13 14:40:07 crc 
kubenswrapper[4797]: I1013 14:40:07.853934 4797 scope.go:117] "RemoveContainer" containerID="039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.854036 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.859522 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7b6a4ded-dd7b-45bc-94c3-379d1288d07f","Type":"ContainerStarted","Data":"456bbb4468d5a9da25a49b122457b1d1d071ee8e72cdbbec9068e0b963437f61"} Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.869528 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0","Type":"ContainerStarted","Data":"00a616b04a8e93aebb2c32e35cf5a4a5ef79d91ebc06ed821d47a730e79656d4"} Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.873311 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.8732904489999997 podStartE2EDuration="3.873290449s" podCreationTimestamp="2025-10-13 14:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:40:07.871892525 +0000 UTC m=+5585.405442801" watchObservedRunningTime="2025-10-13 14:40:07.873290449 +0000 UTC m=+5585.406840715" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.879212 4797 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" podUID="7b6a4ded-dd7b-45bc-94c3-379d1288d07f" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.883971 4797 scope.go:117] "RemoveContainer" containerID="039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c" Oct 13 14:40:07 crc 
kubenswrapper[4797]: E1013 14:40:07.884473 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c\": container with ID starting with 039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c not found: ID does not exist" containerID="039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c" Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.884523 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c"} err="failed to get container status \"039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c\": rpc error: code = NotFound desc = could not find container \"039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c\": container with ID starting with 039442d9b3b20e36ecf325ef2cc27220189afd91b1ab889858082b8db4a65e4c not found: ID does not exist" Oct 13 14:40:07 crc kubenswrapper[4797]: W1013 14:40:07.942422 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b21f03c_f0be_45b6_b3d2_eaa44eb95e3d.slice/crio-05b88014a2ac2eef2337da516973ebe3f1eb28375731bd20de1bf2e4b1ed1201 WatchSource:0}: Error finding container 05b88014a2ac2eef2337da516973ebe3f1eb28375731bd20de1bf2e4b1ed1201: Status 404 returned error can't find the container with id 05b88014a2ac2eef2337da516973ebe3f1eb28375731bd20de1bf2e4b1ed1201 Oct 13 14:40:07 crc kubenswrapper[4797]: I1013 14:40:07.943485 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 14:40:08 crc kubenswrapper[4797]: I1013 14:40:08.892456 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d","Type":"ContainerStarted","Data":"05b88014a2ac2eef2337da516973ebe3f1eb28375731bd20de1bf2e4b1ed1201"} Oct 13 14:40:08 crc kubenswrapper[4797]: I1013 14:40:08.896546 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"637619fb-02b4-4561-9765-0d75ef8ab480","Type":"ContainerStarted","Data":"0331070d105a645598bdd3bf76041c3f8b11bc083aab32fe93644d257d364e91"} Oct 13 14:40:08 crc kubenswrapper[4797]: I1013 14:40:08.896691 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 13 14:40:08 crc kubenswrapper[4797]: I1013 14:40:08.926875 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.249412028 podStartE2EDuration="3.926841307s" podCreationTimestamp="2025-10-13 14:40:05 +0000 UTC" firstStartedPulling="2025-10-13 14:40:07.143933989 +0000 UTC m=+5584.677484245" lastFinishedPulling="2025-10-13 14:40:07.821363268 +0000 UTC m=+5585.354913524" observedRunningTime="2025-10-13 14:40:08.916028022 +0000 UTC m=+5586.449578288" watchObservedRunningTime="2025-10-13 14:40:08.926841307 +0000 UTC m=+5586.460391563" Oct 13 14:40:09 crc kubenswrapper[4797]: I1013 14:40:09.259071 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468f31d5-3d01-48fa-92ed-82c6cafc7690" path="/var/lib/kubelet/pods/468f31d5-3d01-48fa-92ed-82c6cafc7690/volumes" Oct 13 14:40:13 crc kubenswrapper[4797]: I1013 14:40:13.945435 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d","Type":"ContainerStarted","Data":"457cab69317a358ee45ca501297dda0dcbeb3f42ef5e13a5f1bdfed01664f72f"} Oct 13 14:40:13 crc kubenswrapper[4797]: I1013 14:40:13.947417 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0","Type":"ContainerStarted","Data":"7064ef33aafa9750af29b5505c710f208bf8c9be616447eb0d4d72537742f86d"} Oct 13 14:40:14 crc kubenswrapper[4797]: I1013 14:40:14.031360 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s6mjj"] Oct 13 14:40:14 crc kubenswrapper[4797]: I1013 14:40:14.042616 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s6mjj"] Oct 13 14:40:15 crc kubenswrapper[4797]: I1013 14:40:15.246771 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54342a44-f051-411b-8a7c-fabc11ebe1dc" path="/var/lib/kubelet/pods/54342a44-f051-411b-8a7c-fabc11ebe1dc/volumes" Oct 13 14:40:15 crc kubenswrapper[4797]: I1013 14:40:15.724229 4797 scope.go:117] "RemoveContainer" containerID="1301c51c48e78ec2f0067fb09bb6b31345679c53adfafb2eb36162c4ed150680" Oct 13 14:40:15 crc kubenswrapper[4797]: I1013 14:40:15.750662 4797 scope.go:117] "RemoveContainer" containerID="d1b82afce4cb561251ab198fd7c2cec35b900e5a56031f7eeba8e0e75efd2f34" Oct 13 14:40:15 crc kubenswrapper[4797]: I1013 14:40:15.799573 4797 scope.go:117] "RemoveContainer" containerID="08e76799c38714222e19274bef91718a1a471eb392fc01b5254816f1105ec6f8" Oct 13 14:40:15 crc kubenswrapper[4797]: I1013 14:40:15.838972 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 13 14:40:22 crc kubenswrapper[4797]: I1013 14:40:22.029450 4797 generic.go:334] "Generic (PLEG): container finished" podID="8431ed84-8990-4a23-9d29-0f7ef6dfc9d0" containerID="7064ef33aafa9750af29b5505c710f208bf8c9be616447eb0d4d72537742f86d" exitCode=0 Oct 13 14:40:22 crc kubenswrapper[4797]: I1013 14:40:22.029555 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0","Type":"ContainerDied","Data":"7064ef33aafa9750af29b5505c710f208bf8c9be616447eb0d4d72537742f86d"} Oct 
13 14:40:22 crc kubenswrapper[4797]: I1013 14:40:22.032935 4797 generic.go:334] "Generic (PLEG): container finished" podID="5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d" containerID="457cab69317a358ee45ca501297dda0dcbeb3f42ef5e13a5f1bdfed01664f72f" exitCode=0 Oct 13 14:40:22 crc kubenswrapper[4797]: I1013 14:40:22.032983 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d","Type":"ContainerDied","Data":"457cab69317a358ee45ca501297dda0dcbeb3f42ef5e13a5f1bdfed01664f72f"} Oct 13 14:40:25 crc kubenswrapper[4797]: I1013 14:40:25.068182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0","Type":"ContainerStarted","Data":"ad343eaa5ab484c9dbf3c876f8eb19fd73fe01da87852a47e90daa5ed4bf730c"} Oct 13 14:40:28 crc kubenswrapper[4797]: I1013 14:40:28.116431 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8431ed84-8990-4a23-9d29-0f7ef6dfc9d0","Type":"ContainerStarted","Data":"b0f806b6ea0a352308a75b1bbd16d4916e4b4e29965efecfac721531096c9aac"} Oct 13 14:40:28 crc kubenswrapper[4797]: I1013 14:40:28.117665 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:28 crc kubenswrapper[4797]: I1013 14:40:28.121785 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 13 14:40:28 crc kubenswrapper[4797]: I1013 14:40:28.148822 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=4.99537536 podStartE2EDuration="22.14875427s" podCreationTimestamp="2025-10-13 14:40:06 +0000 UTC" firstStartedPulling="2025-10-13 14:40:07.389445741 +0000 UTC m=+5584.922995997" lastFinishedPulling="2025-10-13 14:40:24.542824651 +0000 UTC 
m=+5602.076374907" observedRunningTime="2025-10-13 14:40:28.142276281 +0000 UTC m=+5605.675826557" watchObservedRunningTime="2025-10-13 14:40:28.14875427 +0000 UTC m=+5605.682304546" Oct 13 14:40:29 crc kubenswrapper[4797]: I1013 14:40:29.126500 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d","Type":"ContainerStarted","Data":"03b569e057e266cc531b7b9696ab6bed6e7878a47ddb8a1ef30d5d7c65fdfa02"} Oct 13 14:40:32 crc kubenswrapper[4797]: I1013 14:40:32.157969 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d","Type":"ContainerStarted","Data":"21b980b31ff129b46194322b7024eb21e6a0f96190e3b35c85fc20527f57c0ce"} Oct 13 14:40:36 crc kubenswrapper[4797]: I1013 14:40:36.215975 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d","Type":"ContainerStarted","Data":"f64e4eb6e24e7478d7b0cc4d6f582f3c9b555cfedbb9bcc4268eaec841985f22"} Oct 13 14:40:36 crc kubenswrapper[4797]: I1013 14:40:36.244532 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.052432774 podStartE2EDuration="31.244511403s" podCreationTimestamp="2025-10-13 14:40:05 +0000 UTC" firstStartedPulling="2025-10-13 14:40:07.945508148 +0000 UTC m=+5585.479058404" lastFinishedPulling="2025-10-13 14:40:35.137586777 +0000 UTC m=+5612.671137033" observedRunningTime="2025-10-13 14:40:36.237580663 +0000 UTC m=+5613.771130929" watchObservedRunningTime="2025-10-13 14:40:36.244511403 +0000 UTC m=+5613.778061659" Oct 13 14:40:37 crc kubenswrapper[4797]: I1013 14:40:37.443641 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:37 crc kubenswrapper[4797]: I1013 14:40:37.443991 
4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:37 crc kubenswrapper[4797]: I1013 14:40:37.447231 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.232894 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.323908 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.326867 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.329483 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.330865 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.356919 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452520 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452568 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-log-httpd\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " 
pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452604 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-scripts\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452642 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452728 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-config-data\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452788 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kd4t\" (UniqueName: \"kubernetes.io/projected/703b9830-3383-4f8d-8ef5-376e2f956a60-kube-api-access-7kd4t\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.452848 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-run-httpd\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.554439 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7kd4t\" (UniqueName: \"kubernetes.io/projected/703b9830-3383-4f8d-8ef5-376e2f956a60-kube-api-access-7kd4t\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.554612 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-run-httpd\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.554680 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.554732 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-log-httpd\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.555162 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-log-httpd\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.555163 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-run-httpd\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " 
pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.555260 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-scripts\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.555325 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.555518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-config-data\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.562844 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-scripts\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.565692 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-config-data\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.572376 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.580603 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.593587 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kd4t\" (UniqueName: \"kubernetes.io/projected/703b9830-3383-4f8d-8ef5-376e2f956a60-kube-api-access-7kd4t\") pod \"ceilometer-0\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " pod="openstack/ceilometer-0" Oct 13 14:40:38 crc kubenswrapper[4797]: I1013 14:40:38.649742 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:40:39 crc kubenswrapper[4797]: I1013 14:40:39.206889 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:40:39 crc kubenswrapper[4797]: W1013 14:40:39.217823 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod703b9830_3383_4f8d_8ef5_376e2f956a60.slice/crio-7690d1b50ce1fa1b043d163471a847c174e496f51cdac424bd0c84053803b4b2 WatchSource:0}: Error finding container 7690d1b50ce1fa1b043d163471a847c174e496f51cdac424bd0c84053803b4b2: Status 404 returned error can't find the container with id 7690d1b50ce1fa1b043d163471a847c174e496f51cdac424bd0c84053803b4b2 Oct 13 14:40:39 crc kubenswrapper[4797]: I1013 14:40:39.269779 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerStarted","Data":"7690d1b50ce1fa1b043d163471a847c174e496f51cdac424bd0c84053803b4b2"} Oct 13 14:40:43 crc kubenswrapper[4797]: I1013 14:40:43.306920 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerStarted","Data":"2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d"} Oct 13 14:40:45 crc kubenswrapper[4797]: I1013 14:40:45.046659 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rq7jf"] Oct 13 14:40:45 crc kubenswrapper[4797]: I1013 14:40:45.056293 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rq7jf"] Oct 13 14:40:45 crc kubenswrapper[4797]: I1013 14:40:45.251036 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad69435-1f0a-4d7d-b3c6-b5c6eb491740" path="/var/lib/kubelet/pods/dad69435-1f0a-4d7d-b3c6-b5c6eb491740/volumes" Oct 13 14:40:45 crc kubenswrapper[4797]: I1013 14:40:45.337004 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerStarted","Data":"792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab"} Oct 13 14:40:46 crc kubenswrapper[4797]: I1013 14:40:46.353741 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerStarted","Data":"24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc"} Oct 13 14:40:48 crc kubenswrapper[4797]: I1013 14:40:48.120371 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:40:48 crc kubenswrapper[4797]: I1013 14:40:48.121033 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:40:48 crc kubenswrapper[4797]: I1013 14:40:48.394568 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerStarted","Data":"87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c"} Oct 13 14:40:48 crc kubenswrapper[4797]: I1013 14:40:48.395953 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 14:40:48 crc kubenswrapper[4797]: I1013 14:40:48.426638 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.268281053 podStartE2EDuration="10.426619428s" podCreationTimestamp="2025-10-13 14:40:38 
+0000 UTC" firstStartedPulling="2025-10-13 14:40:39.220917037 +0000 UTC m=+5616.754467293" lastFinishedPulling="2025-10-13 14:40:47.379255392 +0000 UTC m=+5624.912805668" observedRunningTime="2025-10-13 14:40:48.41525624 +0000 UTC m=+5625.948806516" watchObservedRunningTime="2025-10-13 14:40:48.426619428 +0000 UTC m=+5625.960169674" Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.028732 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7d13-account-create-lddbw"] Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.038580 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7d13-account-create-lddbw"] Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.249555 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6ff53c-6611-4d2a-a7fb-42d8f5d75183" path="/var/lib/kubelet/pods/3e6ff53c-6611-4d2a-a7fb-42d8f5d75183/volumes" Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.299216 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-rzdsz"] Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.301011 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-rzdsz" Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.313005 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rzdsz"] Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.427392 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbc5v\" (UniqueName: \"kubernetes.io/projected/d1fb0bc1-01e6-4c15-bbfc-849333911dc5-kube-api-access-sbc5v\") pod \"aodh-db-create-rzdsz\" (UID: \"d1fb0bc1-01e6-4c15-bbfc-849333911dc5\") " pod="openstack/aodh-db-create-rzdsz" Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.528940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbc5v\" (UniqueName: \"kubernetes.io/projected/d1fb0bc1-01e6-4c15-bbfc-849333911dc5-kube-api-access-sbc5v\") pod \"aodh-db-create-rzdsz\" (UID: \"d1fb0bc1-01e6-4c15-bbfc-849333911dc5\") " pod="openstack/aodh-db-create-rzdsz" Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.548594 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbc5v\" (UniqueName: \"kubernetes.io/projected/d1fb0bc1-01e6-4c15-bbfc-849333911dc5-kube-api-access-sbc5v\") pod \"aodh-db-create-rzdsz\" (UID: \"d1fb0bc1-01e6-4c15-bbfc-849333911dc5\") " pod="openstack/aodh-db-create-rzdsz" Oct 13 14:40:55 crc kubenswrapper[4797]: I1013 14:40:55.620094 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-rzdsz" Oct 13 14:40:56 crc kubenswrapper[4797]: I1013 14:40:56.114653 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rzdsz"] Oct 13 14:40:56 crc kubenswrapper[4797]: I1013 14:40:56.487941 4797 generic.go:334] "Generic (PLEG): container finished" podID="d1fb0bc1-01e6-4c15-bbfc-849333911dc5" containerID="afd0317bc27bff6b8ff89a7a9479df6b562f6a9e3d85a1cb74319a2a7afff01c" exitCode=0 Oct 13 14:40:56 crc kubenswrapper[4797]: I1013 14:40:56.488034 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rzdsz" event={"ID":"d1fb0bc1-01e6-4c15-bbfc-849333911dc5","Type":"ContainerDied","Data":"afd0317bc27bff6b8ff89a7a9479df6b562f6a9e3d85a1cb74319a2a7afff01c"} Oct 13 14:40:56 crc kubenswrapper[4797]: I1013 14:40:56.488084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rzdsz" event={"ID":"d1fb0bc1-01e6-4c15-bbfc-849333911dc5","Type":"ContainerStarted","Data":"8b65b52be27e86507a8a27e70328569e84c5cb60048344e3215b141830c03656"} Oct 13 14:40:57 crc kubenswrapper[4797]: I1013 14:40:57.871570 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rzdsz" Oct 13 14:40:57 crc kubenswrapper[4797]: I1013 14:40:57.977194 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbc5v\" (UniqueName: \"kubernetes.io/projected/d1fb0bc1-01e6-4c15-bbfc-849333911dc5-kube-api-access-sbc5v\") pod \"d1fb0bc1-01e6-4c15-bbfc-849333911dc5\" (UID: \"d1fb0bc1-01e6-4c15-bbfc-849333911dc5\") " Oct 13 14:40:57 crc kubenswrapper[4797]: I1013 14:40:57.989064 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1fb0bc1-01e6-4c15-bbfc-849333911dc5-kube-api-access-sbc5v" (OuterVolumeSpecName: "kube-api-access-sbc5v") pod "d1fb0bc1-01e6-4c15-bbfc-849333911dc5" (UID: "d1fb0bc1-01e6-4c15-bbfc-849333911dc5"). 
InnerVolumeSpecName "kube-api-access-sbc5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:40:58 crc kubenswrapper[4797]: I1013 14:40:58.079901 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbc5v\" (UniqueName: \"kubernetes.io/projected/d1fb0bc1-01e6-4c15-bbfc-849333911dc5-kube-api-access-sbc5v\") on node \"crc\" DevicePath \"\"" Oct 13 14:40:58 crc kubenswrapper[4797]: I1013 14:40:58.509144 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rzdsz" event={"ID":"d1fb0bc1-01e6-4c15-bbfc-849333911dc5","Type":"ContainerDied","Data":"8b65b52be27e86507a8a27e70328569e84c5cb60048344e3215b141830c03656"} Oct 13 14:40:58 crc kubenswrapper[4797]: I1013 14:40:58.509181 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b65b52be27e86507a8a27e70328569e84c5cb60048344e3215b141830c03656" Oct 13 14:40:58 crc kubenswrapper[4797]: I1013 14:40:58.509252 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-rzdsz" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.046462 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4cnz5"] Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.058716 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4cnz5"] Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.248712 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b7675b-71aa-4ace-85f1-7fbb6c4862ab" path="/var/lib/kubelet/pods/10b7675b-71aa-4ace-85f1-7fbb6c4862ab/volumes" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.426086 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-01bb-account-create-shxkb"] Oct 13 14:41:05 crc kubenswrapper[4797]: E1013 14:41:05.426632 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fb0bc1-01e6-4c15-bbfc-849333911dc5" containerName="mariadb-database-create" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.426658 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fb0bc1-01e6-4c15-bbfc-849333911dc5" containerName="mariadb-database-create" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.426951 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1fb0bc1-01e6-4c15-bbfc-849333911dc5" containerName="mariadb-database-create" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.427836 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.428792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6fjc\" (UniqueName: \"kubernetes.io/projected/55226f30-4bd9-44e8-806d-38f43b21b396-kube-api-access-x6fjc\") pod \"aodh-01bb-account-create-shxkb\" (UID: \"55226f30-4bd9-44e8-806d-38f43b21b396\") " pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.430124 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.438290 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-01bb-account-create-shxkb"] Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.531002 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6fjc\" (UniqueName: \"kubernetes.io/projected/55226f30-4bd9-44e8-806d-38f43b21b396-kube-api-access-x6fjc\") pod \"aodh-01bb-account-create-shxkb\" (UID: \"55226f30-4bd9-44e8-806d-38f43b21b396\") " pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.563527 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6fjc\" (UniqueName: \"kubernetes.io/projected/55226f30-4bd9-44e8-806d-38f43b21b396-kube-api-access-x6fjc\") pod \"aodh-01bb-account-create-shxkb\" (UID: \"55226f30-4bd9-44e8-806d-38f43b21b396\") " pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:05 crc kubenswrapper[4797]: I1013 14:41:05.745872 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:06 crc kubenswrapper[4797]: W1013 14:41:06.246767 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55226f30_4bd9_44e8_806d_38f43b21b396.slice/crio-3a0c12dee02fe51a39358efa37eb87157b1636da102c71aec1598c61024de019 WatchSource:0}: Error finding container 3a0c12dee02fe51a39358efa37eb87157b1636da102c71aec1598c61024de019: Status 404 returned error can't find the container with id 3a0c12dee02fe51a39358efa37eb87157b1636da102c71aec1598c61024de019 Oct 13 14:41:06 crc kubenswrapper[4797]: I1013 14:41:06.255616 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-01bb-account-create-shxkb"] Oct 13 14:41:06 crc kubenswrapper[4797]: I1013 14:41:06.592881 4797 generic.go:334] "Generic (PLEG): container finished" podID="55226f30-4bd9-44e8-806d-38f43b21b396" containerID="082ea0e307371daf17cdfd468e4cb66a164be8c76aaa0e195c1e0ffb4564273c" exitCode=0 Oct 13 14:41:06 crc kubenswrapper[4797]: I1013 14:41:06.592924 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-01bb-account-create-shxkb" event={"ID":"55226f30-4bd9-44e8-806d-38f43b21b396","Type":"ContainerDied","Data":"082ea0e307371daf17cdfd468e4cb66a164be8c76aaa0e195c1e0ffb4564273c"} Oct 13 14:41:06 crc kubenswrapper[4797]: I1013 14:41:06.592951 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-01bb-account-create-shxkb" event={"ID":"55226f30-4bd9-44e8-806d-38f43b21b396","Type":"ContainerStarted","Data":"3a0c12dee02fe51a39358efa37eb87157b1636da102c71aec1598c61024de019"} Oct 13 14:41:07 crc kubenswrapper[4797]: I1013 14:41:07.969932 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.083332 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6fjc\" (UniqueName: \"kubernetes.io/projected/55226f30-4bd9-44e8-806d-38f43b21b396-kube-api-access-x6fjc\") pod \"55226f30-4bd9-44e8-806d-38f43b21b396\" (UID: \"55226f30-4bd9-44e8-806d-38f43b21b396\") " Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.087962 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55226f30-4bd9-44e8-806d-38f43b21b396-kube-api-access-x6fjc" (OuterVolumeSpecName: "kube-api-access-x6fjc") pod "55226f30-4bd9-44e8-806d-38f43b21b396" (UID: "55226f30-4bd9-44e8-806d-38f43b21b396"). InnerVolumeSpecName "kube-api-access-x6fjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.186264 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6fjc\" (UniqueName: \"kubernetes.io/projected/55226f30-4bd9-44e8-806d-38f43b21b396-kube-api-access-x6fjc\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.611570 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-01bb-account-create-shxkb" event={"ID":"55226f30-4bd9-44e8-806d-38f43b21b396","Type":"ContainerDied","Data":"3a0c12dee02fe51a39358efa37eb87157b1636da102c71aec1598c61024de019"} Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.611610 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a0c12dee02fe51a39358efa37eb87157b1636da102c71aec1598c61024de019" Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.611634 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-01bb-account-create-shxkb" Oct 13 14:41:08 crc kubenswrapper[4797]: I1013 14:41:08.662588 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.857234 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-kf7zx"] Oct 13 14:41:10 crc kubenswrapper[4797]: E1013 14:41:10.858310 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55226f30-4bd9-44e8-806d-38f43b21b396" containerName="mariadb-account-create" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.858329 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="55226f30-4bd9-44e8-806d-38f43b21b396" containerName="mariadb-account-create" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.858585 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="55226f30-4bd9-44e8-806d-38f43b21b396" containerName="mariadb-account-create" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.859512 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.865511 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.868626 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-gzhv4" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.869350 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 13 14:41:10 crc kubenswrapper[4797]: I1013 14:41:10.902545 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-kf7zx"] Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.043002 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-scripts\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.043246 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-combined-ca-bundle\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.043306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-config-data\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.043583 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6wx\" (UniqueName: \"kubernetes.io/projected/302b0733-f0b0-4d8c-acc8-1f86c16a2920-kube-api-access-pp6wx\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.145227 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-scripts\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.145346 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-combined-ca-bundle\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.145372 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-config-data\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.145454 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6wx\" (UniqueName: \"kubernetes.io/projected/302b0733-f0b0-4d8c-acc8-1f86c16a2920-kube-api-access-pp6wx\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.150535 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-scripts\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.150703 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-combined-ca-bundle\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.151093 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-config-data\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.162778 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6wx\" (UniqueName: \"kubernetes.io/projected/302b0733-f0b0-4d8c-acc8-1f86c16a2920-kube-api-access-pp6wx\") pod \"aodh-db-sync-kf7zx\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.192299 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:11 crc kubenswrapper[4797]: I1013 14:41:11.669970 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-kf7zx"] Oct 13 14:41:12 crc kubenswrapper[4797]: I1013 14:41:12.656657 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-kf7zx" event={"ID":"302b0733-f0b0-4d8c-acc8-1f86c16a2920","Type":"ContainerStarted","Data":"c930aaf476091cd86888e80e510c577f36fd6a02a3cf53763bc31f65bb2fcbc6"} Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.642051 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j8w4s"] Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.644944 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.656989 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-utilities\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.657436 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-catalog-content\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.657711 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7sfl\" (UniqueName: \"kubernetes.io/projected/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-kube-api-access-l7sfl\") pod 
\"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.674011 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j8w4s"] Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.758828 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7sfl\" (UniqueName: \"kubernetes.io/projected/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-kube-api-access-l7sfl\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.759126 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-utilities\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.759278 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-catalog-content\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.759647 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-utilities\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.759764 4797 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-catalog-content\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.778920 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7sfl\" (UniqueName: \"kubernetes.io/projected/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-kube-api-access-l7sfl\") pod \"certified-operators-j8w4s\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:13 crc kubenswrapper[4797]: I1013 14:41:13.977830 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:14 crc kubenswrapper[4797]: I1013 14:41:14.473201 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j8w4s"] Oct 13 14:41:15 crc kubenswrapper[4797]: I1013 14:41:15.989846 4797 scope.go:117] "RemoveContainer" containerID="b378593e68d043e685faca81e7e6ac3933ada9f1711ba5b62aff96b0ab29e96c" Oct 13 14:41:16 crc kubenswrapper[4797]: I1013 14:41:16.796029 4797 scope.go:117] "RemoveContainer" containerID="21561b5d69e406155ea00274e6a74937b50060af2d456bc1598ac6f0aec22045" Oct 13 14:41:16 crc kubenswrapper[4797]: I1013 14:41:16.908195 4797 scope.go:117] "RemoveContainer" containerID="a5a9ce3a90682dacfe8f75663327b413798499a48b0a9f187a030c8c9f5d76b6" Oct 13 14:41:17 crc kubenswrapper[4797]: I1013 14:41:17.707327 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerID="2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517" exitCode=0 Oct 13 14:41:17 crc kubenswrapper[4797]: I1013 14:41:17.707382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8w4s" 
event={"ID":"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a","Type":"ContainerDied","Data":"2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517"} Oct 13 14:41:17 crc kubenswrapper[4797]: I1013 14:41:17.707403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8w4s" event={"ID":"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a","Type":"ContainerStarted","Data":"56f890f2809681ae2279175530c4f90d7417e2354e77a39f125013dae06b69ed"} Oct 13 14:41:17 crc kubenswrapper[4797]: I1013 14:41:17.716627 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-kf7zx" event={"ID":"302b0733-f0b0-4d8c-acc8-1f86c16a2920","Type":"ContainerStarted","Data":"3651e4efedbe2cd36246ba3e3251d5d5490eb4d5d10c66a45cafd8ca8f199fe5"} Oct 13 14:41:17 crc kubenswrapper[4797]: I1013 14:41:17.741751 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-kf7zx" podStartSLOduration=2.504996184 podStartE2EDuration="7.741729556s" podCreationTimestamp="2025-10-13 14:41:10 +0000 UTC" firstStartedPulling="2025-10-13 14:41:11.674520638 +0000 UTC m=+5649.208070894" lastFinishedPulling="2025-10-13 14:41:16.91125401 +0000 UTC m=+5654.444804266" observedRunningTime="2025-10-13 14:41:17.739272276 +0000 UTC m=+5655.272822552" watchObservedRunningTime="2025-10-13 14:41:17.741729556 +0000 UTC m=+5655.275279812" Oct 13 14:41:18 crc kubenswrapper[4797]: I1013 14:41:18.120427 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:41:18 crc kubenswrapper[4797]: I1013 14:41:18.120685 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:41:19 crc kubenswrapper[4797]: I1013 14:41:19.740454 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerID="bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537" exitCode=0 Oct 13 14:41:19 crc kubenswrapper[4797]: I1013 14:41:19.740994 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8w4s" event={"ID":"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a","Type":"ContainerDied","Data":"bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537"} Oct 13 14:41:20 crc kubenswrapper[4797]: I1013 14:41:20.752863 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8w4s" event={"ID":"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a","Type":"ContainerStarted","Data":"7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5"} Oct 13 14:41:20 crc kubenswrapper[4797]: I1013 14:41:20.754237 4797 generic.go:334] "Generic (PLEG): container finished" podID="302b0733-f0b0-4d8c-acc8-1f86c16a2920" containerID="3651e4efedbe2cd36246ba3e3251d5d5490eb4d5d10c66a45cafd8ca8f199fe5" exitCode=0 Oct 13 14:41:20 crc kubenswrapper[4797]: I1013 14:41:20.754263 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-kf7zx" event={"ID":"302b0733-f0b0-4d8c-acc8-1f86c16a2920","Type":"ContainerDied","Data":"3651e4efedbe2cd36246ba3e3251d5d5490eb4d5d10c66a45cafd8ca8f199fe5"} Oct 13 14:41:20 crc kubenswrapper[4797]: I1013 14:41:20.778561 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j8w4s" podStartSLOduration=5.117672513 podStartE2EDuration="7.778537099s" podCreationTimestamp="2025-10-13 14:41:13 +0000 UTC" firstStartedPulling="2025-10-13 14:41:17.708995415 +0000 UTC 
m=+5655.242545671" lastFinishedPulling="2025-10-13 14:41:20.369859961 +0000 UTC m=+5657.903410257" observedRunningTime="2025-10-13 14:41:20.769652271 +0000 UTC m=+5658.303202547" watchObservedRunningTime="2025-10-13 14:41:20.778537099 +0000 UTC m=+5658.312087385" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.283271 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.340794 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-combined-ca-bundle\") pod \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.341002 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp6wx\" (UniqueName: \"kubernetes.io/projected/302b0733-f0b0-4d8c-acc8-1f86c16a2920-kube-api-access-pp6wx\") pod \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.341127 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-scripts\") pod \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.341199 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-config-data\") pod \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\" (UID: \"302b0733-f0b0-4d8c-acc8-1f86c16a2920\") " Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.348988 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/302b0733-f0b0-4d8c-acc8-1f86c16a2920-kube-api-access-pp6wx" (OuterVolumeSpecName: "kube-api-access-pp6wx") pod "302b0733-f0b0-4d8c-acc8-1f86c16a2920" (UID: "302b0733-f0b0-4d8c-acc8-1f86c16a2920"). InnerVolumeSpecName "kube-api-access-pp6wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.349024 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-scripts" (OuterVolumeSpecName: "scripts") pod "302b0733-f0b0-4d8c-acc8-1f86c16a2920" (UID: "302b0733-f0b0-4d8c-acc8-1f86c16a2920"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.368303 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-config-data" (OuterVolumeSpecName: "config-data") pod "302b0733-f0b0-4d8c-acc8-1f86c16a2920" (UID: "302b0733-f0b0-4d8c-acc8-1f86c16a2920"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.380706 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "302b0733-f0b0-4d8c-acc8-1f86c16a2920" (UID: "302b0733-f0b0-4d8c-acc8-1f86c16a2920"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.443135 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.443179 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.443195 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp6wx\" (UniqueName: \"kubernetes.io/projected/302b0733-f0b0-4d8c-acc8-1f86c16a2920-kube-api-access-pp6wx\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.443210 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302b0733-f0b0-4d8c-acc8-1f86c16a2920-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.773827 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-kf7zx" event={"ID":"302b0733-f0b0-4d8c-acc8-1f86c16a2920","Type":"ContainerDied","Data":"c930aaf476091cd86888e80e510c577f36fd6a02a3cf53763bc31f65bb2fcbc6"} Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.773872 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c930aaf476091cd86888e80e510c577f36fd6a02a3cf53763bc31f65bb2fcbc6" Oct 13 14:41:22 crc kubenswrapper[4797]: I1013 14:41:22.773936 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-kf7zx" Oct 13 14:41:23 crc kubenswrapper[4797]: I1013 14:41:23.978636 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:23 crc kubenswrapper[4797]: I1013 14:41:23.978921 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:24 crc kubenswrapper[4797]: I1013 14:41:24.032780 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.936794 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 13 14:41:25 crc kubenswrapper[4797]: E1013 14:41:25.937616 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302b0733-f0b0-4d8c-acc8-1f86c16a2920" containerName="aodh-db-sync" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.937636 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="302b0733-f0b0-4d8c-acc8-1f86c16a2920" containerName="aodh-db-sync" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.937919 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="302b0733-f0b0-4d8c-acc8-1f86c16a2920" containerName="aodh-db-sync" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.940518 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.942035 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-gzhv4" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.942412 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.942726 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 13 14:41:25 crc kubenswrapper[4797]: I1013 14:41:25.977594 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.013316 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.013404 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-scripts\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.013469 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-config-data\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.013498 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmj9\" (UniqueName: 
\"kubernetes.io/projected/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-kube-api-access-7lmj9\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.116728 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-config-data\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.117215 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmj9\" (UniqueName: \"kubernetes.io/projected/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-kube-api-access-7lmj9\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.117726 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.118003 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-scripts\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.124846 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-scripts\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.126648 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-config-data\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.127131 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.140906 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmj9\" (UniqueName: \"kubernetes.io/projected/a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3-kube-api-access-7lmj9\") pod \"aodh-0\" (UID: \"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3\") " pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.273666 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.780439 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 13 14:41:26 crc kubenswrapper[4797]: I1013 14:41:26.816197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3","Type":"ContainerStarted","Data":"d9d67dd6016ab94f1b18e572f5408b1d8603af969162505a55b6a89161457b45"} Oct 13 14:41:27 crc kubenswrapper[4797]: I1013 14:41:27.826737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3","Type":"ContainerStarted","Data":"9929b567f6c80885ff7aa143595791e616d72c87478f7ba49f3c411e558ca9fb"} Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.612966 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.613556 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-central-agent" containerID="cri-o://2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d" gracePeriod=30 Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.613650 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="sg-core" containerID="cri-o://24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc" gracePeriod=30 Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.613676 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-notification-agent" containerID="cri-o://792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab" gracePeriod=30 Oct 13 14:41:28 crc 
kubenswrapper[4797]: I1013 14:41:28.613928 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="proxy-httpd" containerID="cri-o://87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c" gracePeriod=30 Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.842648 4797 generic.go:334] "Generic (PLEG): container finished" podID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerID="87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c" exitCode=0 Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.842691 4797 generic.go:334] "Generic (PLEG): container finished" podID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerID="24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc" exitCode=2 Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.842730 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerDied","Data":"87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c"} Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.842784 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerDied","Data":"24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc"} Oct 13 14:41:28 crc kubenswrapper[4797]: I1013 14:41:28.846782 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3","Type":"ContainerStarted","Data":"341a82df1de128abf9ad3c56586f6e605425754b5dcc49274b72a4f15ea28e25"} Oct 13 14:41:29 crc kubenswrapper[4797]: I1013 14:41:29.859295 4797 generic.go:334] "Generic (PLEG): container finished" podID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerID="2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d" exitCode=0 Oct 13 14:41:29 crc 
kubenswrapper[4797]: I1013 14:41:29.859337 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerDied","Data":"2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d"} Oct 13 14:41:29 crc kubenswrapper[4797]: I1013 14:41:29.862222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3","Type":"ContainerStarted","Data":"130343f2a602815104ad5b2bcb0fa84d9d15ed0a61e75dc64195d8f1c79377f2"} Oct 13 14:41:30 crc kubenswrapper[4797]: I1013 14:41:30.888371 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3","Type":"ContainerStarted","Data":"b948ac6f06ef9b83173328abcbc163fdcf346bb7f46cb2c4076eaf45e804de51"} Oct 13 14:41:31 crc kubenswrapper[4797]: I1013 14:41:31.925029 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.140977555 podStartE2EDuration="6.925014794s" podCreationTimestamp="2025-10-13 14:41:25 +0000 UTC" firstStartedPulling="2025-10-13 14:41:26.784114669 +0000 UTC m=+5664.317664925" lastFinishedPulling="2025-10-13 14:41:30.568151908 +0000 UTC m=+5668.101702164" observedRunningTime="2025-10-13 14:41:31.922162654 +0000 UTC m=+5669.455712910" watchObservedRunningTime="2025-10-13 14:41:31.925014794 +0000 UTC m=+5669.458565050" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.834311 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878502 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-combined-ca-bundle\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878597 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-run-httpd\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878634 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kd4t\" (UniqueName: \"kubernetes.io/projected/703b9830-3383-4f8d-8ef5-376e2f956a60-kube-api-access-7kd4t\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878735 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-sg-core-conf-yaml\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878780 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-config-data\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878828 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-log-httpd\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.878917 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-scripts\") pod \"703b9830-3383-4f8d-8ef5-376e2f956a60\" (UID: \"703b9830-3383-4f8d-8ef5-376e2f956a60\") " Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.884898 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.887208 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.891902 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-scripts" (OuterVolumeSpecName: "scripts") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.899658 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703b9830-3383-4f8d-8ef5-376e2f956a60-kube-api-access-7kd4t" (OuterVolumeSpecName: "kube-api-access-7kd4t") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "kube-api-access-7kd4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.923949 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.926092 4797 generic.go:334] "Generic (PLEG): container finished" podID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerID="792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab" exitCode=0 Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.926135 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerDied","Data":"792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab"} Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.926166 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"703b9830-3383-4f8d-8ef5-376e2f956a60","Type":"ContainerDied","Data":"7690d1b50ce1fa1b043d163471a847c174e496f51cdac424bd0c84053803b4b2"} Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.926185 4797 scope.go:117] "RemoveContainer" containerID="87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c" Oct 13 
14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.926332 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.981528 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.981570 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.981583 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/703b9830-3383-4f8d-8ef5-376e2f956a60-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.981596 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kd4t\" (UniqueName: \"kubernetes.io/projected/703b9830-3383-4f8d-8ef5-376e2f956a60-kube-api-access-7kd4t\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.981609 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.996934 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-config-data" (OuterVolumeSpecName: "config-data") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:32 crc kubenswrapper[4797]: I1013 14:41:32.998900 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "703b9830-3383-4f8d-8ef5-376e2f956a60" (UID: "703b9830-3383-4f8d-8ef5-376e2f956a60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.005532 4797 scope.go:117] "RemoveContainer" containerID="24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.025520 4797 scope.go:117] "RemoveContainer" containerID="792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.045220 4797 scope.go:117] "RemoveContainer" containerID="2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.073174 4797 scope.go:117] "RemoveContainer" containerID="87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.074326 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c\": container with ID starting with 87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c not found: ID does not exist" containerID="87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.074368 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c"} err="failed to get container status 
\"87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c\": rpc error: code = NotFound desc = could not find container \"87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c\": container with ID starting with 87cb39d2584b44b56a90852829c6503206f3825023131fb058f878c6731fb31c not found: ID does not exist" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.074395 4797 scope.go:117] "RemoveContainer" containerID="24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.074676 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc\": container with ID starting with 24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc not found: ID does not exist" containerID="24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.074705 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc"} err="failed to get container status \"24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc\": rpc error: code = NotFound desc = could not find container \"24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc\": container with ID starting with 24793c3bb75b41e569e451dc1c9c60adee3ada86d1ab3c2caa44df96a06d1ffc not found: ID does not exist" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.074726 4797 scope.go:117] "RemoveContainer" containerID="792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.074991 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab\": container with ID starting with 792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab not found: ID does not exist" containerID="792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.075082 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab"} err="failed to get container status \"792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab\": rpc error: code = NotFound desc = could not find container \"792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab\": container with ID starting with 792a95a3440e6cf6694214a5b75c09520d68564535efb9b7face0a0ad5850fab not found: ID does not exist" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.075100 4797 scope.go:117] "RemoveContainer" containerID="2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.075330 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d\": container with ID starting with 2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d not found: ID does not exist" containerID="2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.075350 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d"} err="failed to get container status \"2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d\": rpc error: code = NotFound desc = could not find container \"2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d\": container with ID 
starting with 2e37cb4ebfbda5a36a1898e66e2a0915f282c19cae6ae43a6b559100997b0d8d not found: ID does not exist" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.083214 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.083254 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9830-3383-4f8d-8ef5-376e2f956a60-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.268263 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.277408 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.294619 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.295128 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-central-agent" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.295150 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-central-agent" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.295173 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="proxy-httpd" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.295180 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="proxy-httpd" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.295190 4797 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-notification-agent" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.295196 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-notification-agent" Oct 13 14:41:33 crc kubenswrapper[4797]: E1013 14:41:33.295207 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="sg-core" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.295213 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="sg-core" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.303553 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="proxy-httpd" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.303616 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-central-agent" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.303630 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="sg-core" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.303655 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" containerName="ceilometer-notification-agent" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.306389 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.308778 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.320907 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.321567 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393237 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-run-httpd\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393296 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393403 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-scripts\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393423 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " 
pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393443 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-log-httpd\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393487 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99p9s\" (UniqueName: \"kubernetes.io/projected/7b4e14a9-e972-4fc1-815c-f5dfa5399912-kube-api-access-99p9s\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.393512 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-config-data\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.495530 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-config-data\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.495881 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-run-httpd\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.495931 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.496070 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-scripts\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.496096 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.496125 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-log-httpd\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.496196 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99p9s\" (UniqueName: \"kubernetes.io/projected/7b4e14a9-e972-4fc1-815c-f5dfa5399912-kube-api-access-99p9s\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.496241 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-run-httpd\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc 
kubenswrapper[4797]: I1013 14:41:33.496934 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-log-httpd\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.499893 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-scripts\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.499981 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.500601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-config-data\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.506211 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.518794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99p9s\" (UniqueName: \"kubernetes.io/projected/7b4e14a9-e972-4fc1-815c-f5dfa5399912-kube-api-access-99p9s\") pod \"ceilometer-0\" (UID: 
\"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " pod="openstack/ceilometer-0" Oct 13 14:41:33 crc kubenswrapper[4797]: I1013 14:41:33.635950 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.027937 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.080433 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j8w4s"] Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.127588 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:41:34 crc kubenswrapper[4797]: W1013 14:41:34.134152 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b4e14a9_e972_4fc1_815c_f5dfa5399912.slice/crio-469bb3cb24aa866882095d4b37207a724d4e2d94955117c7785ec1e63a166ba7 WatchSource:0}: Error finding container 469bb3cb24aa866882095d4b37207a724d4e2d94955117c7785ec1e63a166ba7: Status 404 returned error can't find the container with id 469bb3cb24aa866882095d4b37207a724d4e2d94955117c7785ec1e63a166ba7 Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.951668 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerStarted","Data":"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3"} Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.951968 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerStarted","Data":"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3"} Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.952075 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerStarted","Data":"469bb3cb24aa866882095d4b37207a724d4e2d94955117c7785ec1e63a166ba7"} Oct 13 14:41:34 crc kubenswrapper[4797]: I1013 14:41:34.951864 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j8w4s" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="registry-server" containerID="cri-o://7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5" gracePeriod=2 Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.255546 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703b9830-3383-4f8d-8ef5-376e2f956a60" path="/var/lib/kubelet/pods/703b9830-3383-4f8d-8ef5-376e2f956a60/volumes" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.614024 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.672729 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-catalog-content\") pod \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.672883 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-utilities\") pod \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.673053 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7sfl\" (UniqueName: \"kubernetes.io/projected/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-kube-api-access-l7sfl\") pod 
\"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\" (UID: \"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a\") " Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.675385 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-utilities" (OuterVolumeSpecName: "utilities") pod "a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" (UID: "a2b3dfa9-6860-44f7-8e0f-98fd29fd729a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.742751 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" (UID: "a2b3dfa9-6860-44f7-8e0f-98fd29fd729a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.759666 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-kube-api-access-l7sfl" (OuterVolumeSpecName: "kube-api-access-l7sfl") pod "a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" (UID: "a2b3dfa9-6860-44f7-8e0f-98fd29fd729a"). InnerVolumeSpecName "kube-api-access-l7sfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.775868 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7sfl\" (UniqueName: \"kubernetes.io/projected/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-kube-api-access-l7sfl\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.775932 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.775948 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.962352 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerID="7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5" exitCode=0 Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.962407 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j8w4s" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.962419 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8w4s" event={"ID":"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a","Type":"ContainerDied","Data":"7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5"} Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.962482 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8w4s" event={"ID":"a2b3dfa9-6860-44f7-8e0f-98fd29fd729a","Type":"ContainerDied","Data":"56f890f2809681ae2279175530c4f90d7417e2354e77a39f125013dae06b69ed"} Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.962505 4797 scope.go:117] "RemoveContainer" containerID="7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5" Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.966534 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerStarted","Data":"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f"} Oct 13 14:41:35 crc kubenswrapper[4797]: I1013 14:41:35.990292 4797 scope.go:117] "RemoveContainer" containerID="bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.002417 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j8w4s"] Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.015278 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j8w4s"] Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.022587 4797 scope.go:117] "RemoveContainer" containerID="2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.041883 4797 scope.go:117] 
"RemoveContainer" containerID="7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5" Oct 13 14:41:36 crc kubenswrapper[4797]: E1013 14:41:36.042302 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5\": container with ID starting with 7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5 not found: ID does not exist" containerID="7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.042354 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5"} err="failed to get container status \"7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5\": rpc error: code = NotFound desc = could not find container \"7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5\": container with ID starting with 7c4fb7dd16fb6dee56e6fa122f840a510a4e9091f326ad1443c5d44c29ce02f5 not found: ID does not exist" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.042384 4797 scope.go:117] "RemoveContainer" containerID="bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537" Oct 13 14:41:36 crc kubenswrapper[4797]: E1013 14:41:36.042595 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537\": container with ID starting with bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537 not found: ID does not exist" containerID="bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.042627 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537"} err="failed to get container status \"bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537\": rpc error: code = NotFound desc = could not find container \"bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537\": container with ID starting with bc9a79f8dd1949d2741d76f163a013fb03d651d61d2233c878dc3eabe9c4d537 not found: ID does not exist" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.042647 4797 scope.go:117] "RemoveContainer" containerID="2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517" Oct 13 14:41:36 crc kubenswrapper[4797]: E1013 14:41:36.042970 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517\": container with ID starting with 2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517 not found: ID does not exist" containerID="2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517" Oct 13 14:41:36 crc kubenswrapper[4797]: I1013 14:41:36.042998 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517"} err="failed to get container status \"2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517\": rpc error: code = NotFound desc = could not find container \"2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517\": container with ID starting with 2a2739b021b6ecb7283af281b9a80df21fe2aae915419013c47966c876ff4517 not found: ID does not exist" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.056997 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-vmdmb"] Oct 13 14:41:37 crc kubenswrapper[4797]: E1013 14:41:37.057677 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="extract-utilities" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.057689 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="extract-utilities" Oct 13 14:41:37 crc kubenswrapper[4797]: E1013 14:41:37.057702 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="registry-server" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.057707 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="registry-server" Oct 13 14:41:37 crc kubenswrapper[4797]: E1013 14:41:37.057733 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="extract-content" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.057739 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="extract-content" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.058014 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" containerName="registry-server" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.058931 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.087848 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-vmdmb"] Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.205860 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svm49\" (UniqueName: \"kubernetes.io/projected/70a423f9-6797-471e-bbd6-d11cfc177239-kube-api-access-svm49\") pod \"manila-db-create-vmdmb\" (UID: \"70a423f9-6797-471e-bbd6-d11cfc177239\") " pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.247332 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b3dfa9-6860-44f7-8e0f-98fd29fd729a" path="/var/lib/kubelet/pods/a2b3dfa9-6860-44f7-8e0f-98fd29fd729a/volumes" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.308934 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svm49\" (UniqueName: \"kubernetes.io/projected/70a423f9-6797-471e-bbd6-d11cfc177239-kube-api-access-svm49\") pod \"manila-db-create-vmdmb\" (UID: \"70a423f9-6797-471e-bbd6-d11cfc177239\") " pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.338221 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svm49\" (UniqueName: \"kubernetes.io/projected/70a423f9-6797-471e-bbd6-d11cfc177239-kube-api-access-svm49\") pod \"manila-db-create-vmdmb\" (UID: \"70a423f9-6797-471e-bbd6-d11cfc177239\") " pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.389819 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:37 crc kubenswrapper[4797]: I1013 14:41:37.935295 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-vmdmb"] Oct 13 14:41:38 crc kubenswrapper[4797]: I1013 14:41:38.014716 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerStarted","Data":"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11"} Oct 13 14:41:38 crc kubenswrapper[4797]: I1013 14:41:38.016087 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 14:41:38 crc kubenswrapper[4797]: I1013 14:41:38.021995 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-vmdmb" event={"ID":"70a423f9-6797-471e-bbd6-d11cfc177239","Type":"ContainerStarted","Data":"99b0feedac36e2b0eca12cc9bc6be8514aedc45494fcea4ea02f403a6e817428"} Oct 13 14:41:38 crc kubenswrapper[4797]: I1013 14:41:38.041106 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.309214563 podStartE2EDuration="5.04108334s" podCreationTimestamp="2025-10-13 14:41:33 +0000 UTC" firstStartedPulling="2025-10-13 14:41:34.136441845 +0000 UTC m=+5671.669992101" lastFinishedPulling="2025-10-13 14:41:36.868310612 +0000 UTC m=+5674.401860878" observedRunningTime="2025-10-13 14:41:38.03984326 +0000 UTC m=+5675.573393526" watchObservedRunningTime="2025-10-13 14:41:38.04108334 +0000 UTC m=+5675.574633586" Oct 13 14:41:39 crc kubenswrapper[4797]: I1013 14:41:39.032864 4797 generic.go:334] "Generic (PLEG): container finished" podID="70a423f9-6797-471e-bbd6-d11cfc177239" containerID="8d1871e05ce344551cfc2d6ba1bee2c6e3a82705548ab0ac6fa5db6f3814345c" exitCode=0 Oct 13 14:41:39 crc kubenswrapper[4797]: I1013 14:41:39.032971 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-db-create-vmdmb" event={"ID":"70a423f9-6797-471e-bbd6-d11cfc177239","Type":"ContainerDied","Data":"8d1871e05ce344551cfc2d6ba1bee2c6e3a82705548ab0ac6fa5db6f3814345c"} Oct 13 14:41:40 crc kubenswrapper[4797]: I1013 14:41:40.486127 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:40 crc kubenswrapper[4797]: I1013 14:41:40.593106 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svm49\" (UniqueName: \"kubernetes.io/projected/70a423f9-6797-471e-bbd6-d11cfc177239-kube-api-access-svm49\") pod \"70a423f9-6797-471e-bbd6-d11cfc177239\" (UID: \"70a423f9-6797-471e-bbd6-d11cfc177239\") " Oct 13 14:41:40 crc kubenswrapper[4797]: I1013 14:41:40.603175 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a423f9-6797-471e-bbd6-d11cfc177239-kube-api-access-svm49" (OuterVolumeSpecName: "kube-api-access-svm49") pod "70a423f9-6797-471e-bbd6-d11cfc177239" (UID: "70a423f9-6797-471e-bbd6-d11cfc177239"). InnerVolumeSpecName "kube-api-access-svm49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:41:40 crc kubenswrapper[4797]: I1013 14:41:40.695524 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svm49\" (UniqueName: \"kubernetes.io/projected/70a423f9-6797-471e-bbd6-d11cfc177239-kube-api-access-svm49\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:41 crc kubenswrapper[4797]: I1013 14:41:41.063655 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-vmdmb" event={"ID":"70a423f9-6797-471e-bbd6-d11cfc177239","Type":"ContainerDied","Data":"99b0feedac36e2b0eca12cc9bc6be8514aedc45494fcea4ea02f403a6e817428"} Oct 13 14:41:41 crc kubenswrapper[4797]: I1013 14:41:41.063697 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99b0feedac36e2b0eca12cc9bc6be8514aedc45494fcea4ea02f403a6e817428" Oct 13 14:41:41 crc kubenswrapper[4797]: I1013 14:41:41.063756 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-vmdmb" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.161416 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-cd37-account-create-chjm4"] Oct 13 14:41:47 crc kubenswrapper[4797]: E1013 14:41:47.162522 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a423f9-6797-471e-bbd6-d11cfc177239" containerName="mariadb-database-create" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.162543 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a423f9-6797-471e-bbd6-d11cfc177239" containerName="mariadb-database-create" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.162863 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a423f9-6797-471e-bbd6-d11cfc177239" containerName="mariadb-database-create" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.163919 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.166039 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.177828 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-cd37-account-create-chjm4"] Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.224019 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsm7\" (UniqueName: \"kubernetes.io/projected/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a-kube-api-access-6gsm7\") pod \"manila-cd37-account-create-chjm4\" (UID: \"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a\") " pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.325757 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsm7\" (UniqueName: \"kubernetes.io/projected/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a-kube-api-access-6gsm7\") pod \"manila-cd37-account-create-chjm4\" (UID: \"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a\") " pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.345328 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsm7\" (UniqueName: \"kubernetes.io/projected/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a-kube-api-access-6gsm7\") pod \"manila-cd37-account-create-chjm4\" (UID: \"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a\") " pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.490577 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:47 crc kubenswrapper[4797]: I1013 14:41:47.970884 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-cd37-account-create-chjm4"] Oct 13 14:41:48 crc kubenswrapper[4797]: I1013 14:41:48.120321 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:41:48 crc kubenswrapper[4797]: I1013 14:41:48.120385 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:41:48 crc kubenswrapper[4797]: I1013 14:41:48.120436 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:41:48 crc kubenswrapper[4797]: I1013 14:41:48.121276 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:41:48 crc kubenswrapper[4797]: I1013 14:41:48.121341 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" 
containerID="cri-o://6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" gracePeriod=600 Oct 13 14:41:48 crc kubenswrapper[4797]: I1013 14:41:48.134181 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-cd37-account-create-chjm4" event={"ID":"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a","Type":"ContainerStarted","Data":"412b37bd15ae3132958f2b71200b5c2de82fb4c2a277da6df48925f9529f75fa"} Oct 13 14:41:48 crc kubenswrapper[4797]: E1013 14:41:48.241923 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:41:49 crc kubenswrapper[4797]: I1013 14:41:49.144492 4797 generic.go:334] "Generic (PLEG): container finished" podID="bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a" containerID="469039f7f4a985d485998e262d8bcf7587ff8148e1469857ab86adf7e8c54c0f" exitCode=0 Oct 13 14:41:49 crc kubenswrapper[4797]: I1013 14:41:49.144544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-cd37-account-create-chjm4" event={"ID":"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a","Type":"ContainerDied","Data":"469039f7f4a985d485998e262d8bcf7587ff8148e1469857ab86adf7e8c54c0f"} Oct 13 14:41:49 crc kubenswrapper[4797]: I1013 14:41:49.149080 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" exitCode=0 Oct 13 14:41:49 crc kubenswrapper[4797]: I1013 14:41:49.149136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00"} Oct 13 14:41:49 crc kubenswrapper[4797]: I1013 14:41:49.149233 4797 scope.go:117] "RemoveContainer" containerID="f6beaa7adf1d21db8fdfdf908e4a91ef09e840de8f57f89fa2a6f4402ae41c29" Oct 13 14:41:49 crc kubenswrapper[4797]: I1013 14:41:49.149832 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:41:49 crc kubenswrapper[4797]: E1013 14:41:49.150168 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:50.599145 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:50.693120 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gsm7\" (UniqueName: \"kubernetes.io/projected/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a-kube-api-access-6gsm7\") pod \"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a\" (UID: \"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a\") " Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:50.700443 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a-kube-api-access-6gsm7" (OuterVolumeSpecName: "kube-api-access-6gsm7") pod "bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a" (UID: "bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a"). InnerVolumeSpecName "kube-api-access-6gsm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:50.795687 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gsm7\" (UniqueName: \"kubernetes.io/projected/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a-kube-api-access-6gsm7\") on node \"crc\" DevicePath \"\"" Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:51.177021 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-cd37-account-create-chjm4" event={"ID":"bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a","Type":"ContainerDied","Data":"412b37bd15ae3132958f2b71200b5c2de82fb4c2a277da6df48925f9529f75fa"} Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:51.177275 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="412b37bd15ae3132958f2b71200b5c2de82fb4c2a277da6df48925f9529f75fa" Oct 13 14:41:51 crc kubenswrapper[4797]: I1013 14:41:51.177130 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-cd37-account-create-chjm4" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.513383 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-pnslk"] Oct 13 14:41:52 crc kubenswrapper[4797]: E1013 14:41:52.513927 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a" containerName="mariadb-account-create" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.513947 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a" containerName="mariadb-account-create" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.514184 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a" containerName="mariadb-account-create" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.515137 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.520194 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.520858 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-xlj6f" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.522512 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-pnslk"] Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.649238 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktlj\" (UniqueName: \"kubernetes.io/projected/b5348b8f-6498-4d16-bde9-20705e21127d-kube-api-access-7ktlj\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.649308 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-combined-ca-bundle\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.649596 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-config-data\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.649716 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: 
\"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-job-config-data\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.751603 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktlj\" (UniqueName: \"kubernetes.io/projected/b5348b8f-6498-4d16-bde9-20705e21127d-kube-api-access-7ktlj\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.751661 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-combined-ca-bundle\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.751738 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-config-data\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.751779 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-job-config-data\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.757181 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-combined-ca-bundle\") pod \"manila-db-sync-pnslk\" (UID: 
\"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.758699 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-job-config-data\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.761763 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-config-data\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.770288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktlj\" (UniqueName: \"kubernetes.io/projected/b5348b8f-6498-4d16-bde9-20705e21127d-kube-api-access-7ktlj\") pod \"manila-db-sync-pnslk\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:52 crc kubenswrapper[4797]: I1013 14:41:52.859967 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-pnslk" Oct 13 14:41:53 crc kubenswrapper[4797]: W1013 14:41:53.489172 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193 WatchSource:0}: Error finding container f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193: Status 404 returned error can't find the container with id f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193 Oct 13 14:41:53 crc kubenswrapper[4797]: I1013 14:41:53.492142 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:41:53 crc kubenswrapper[4797]: I1013 14:41:53.499107 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-pnslk"] Oct 13 14:41:54 crc kubenswrapper[4797]: I1013 14:41:54.220220 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pnslk" event={"ID":"b5348b8f-6498-4d16-bde9-20705e21127d","Type":"ContainerStarted","Data":"f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193"} Oct 13 14:42:00 crc kubenswrapper[4797]: I1013 14:42:00.290973 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pnslk" event={"ID":"b5348b8f-6498-4d16-bde9-20705e21127d","Type":"ContainerStarted","Data":"690f13fdc1ce6bb867d23f5782b24676165ab0c4e203303d58019049af46c27a"} Oct 13 14:42:00 crc kubenswrapper[4797]: I1013 14:42:00.310475 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-pnslk" podStartSLOduration=3.016477709 podStartE2EDuration="8.310451474s" podCreationTimestamp="2025-10-13 14:41:52 +0000 UTC" firstStartedPulling="2025-10-13 14:41:53.491643741 +0000 UTC m=+5691.025194027" lastFinishedPulling="2025-10-13 14:41:58.785617496 +0000 UTC m=+5696.319167792" 
observedRunningTime="2025-10-13 14:42:00.308044985 +0000 UTC m=+5697.841595281" watchObservedRunningTime="2025-10-13 14:42:00.310451474 +0000 UTC m=+5697.844001770" Oct 13 14:42:01 crc kubenswrapper[4797]: I1013 14:42:01.237894 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:42:01 crc kubenswrapper[4797]: E1013 14:42:01.238848 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:42:01 crc kubenswrapper[4797]: I1013 14:42:01.305518 4797 generic.go:334] "Generic (PLEG): container finished" podID="b5348b8f-6498-4d16-bde9-20705e21127d" containerID="690f13fdc1ce6bb867d23f5782b24676165ab0c4e203303d58019049af46c27a" exitCode=0 Oct 13 14:42:01 crc kubenswrapper[4797]: I1013 14:42:01.305592 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pnslk" event={"ID":"b5348b8f-6498-4d16-bde9-20705e21127d","Type":"ContainerDied","Data":"690f13fdc1ce6bb867d23f5782b24676165ab0c4e203303d58019049af46c27a"} Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.762318 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-pnslk" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.794484 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-job-config-data\") pod \"b5348b8f-6498-4d16-bde9-20705e21127d\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.794594 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-config-data\") pod \"b5348b8f-6498-4d16-bde9-20705e21127d\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.794642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-combined-ca-bundle\") pod \"b5348b8f-6498-4d16-bde9-20705e21127d\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.794672 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ktlj\" (UniqueName: \"kubernetes.io/projected/b5348b8f-6498-4d16-bde9-20705e21127d-kube-api-access-7ktlj\") pod \"b5348b8f-6498-4d16-bde9-20705e21127d\" (UID: \"b5348b8f-6498-4d16-bde9-20705e21127d\") " Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.802388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5348b8f-6498-4d16-bde9-20705e21127d-kube-api-access-7ktlj" (OuterVolumeSpecName: "kube-api-access-7ktlj") pod "b5348b8f-6498-4d16-bde9-20705e21127d" (UID: "b5348b8f-6498-4d16-bde9-20705e21127d"). InnerVolumeSpecName "kube-api-access-7ktlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.802434 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "b5348b8f-6498-4d16-bde9-20705e21127d" (UID: "b5348b8f-6498-4d16-bde9-20705e21127d"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.807568 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-config-data" (OuterVolumeSpecName: "config-data") pod "b5348b8f-6498-4d16-bde9-20705e21127d" (UID: "b5348b8f-6498-4d16-bde9-20705e21127d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.828011 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5348b8f-6498-4d16-bde9-20705e21127d" (UID: "b5348b8f-6498-4d16-bde9-20705e21127d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.895937 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.895965 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ktlj\" (UniqueName: \"kubernetes.io/projected/b5348b8f-6498-4d16-bde9-20705e21127d-kube-api-access-7ktlj\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.895976 4797 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:02 crc kubenswrapper[4797]: I1013 14:42:02.895985 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5348b8f-6498-4d16-bde9-20705e21127d-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.323738 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pnslk" event={"ID":"b5348b8f-6498-4d16-bde9-20705e21127d","Type":"ContainerDied","Data":"f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193"} Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.323782 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.324142 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-pnslk" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.600707 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 13 14:42:03 crc kubenswrapper[4797]: E1013 14:42:03.601305 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5348b8f-6498-4d16-bde9-20705e21127d" containerName="manila-db-sync" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.601329 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5348b8f-6498-4d16-bde9-20705e21127d" containerName="manila-db-sync" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.601557 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5348b8f-6498-4d16-bde9-20705e21127d" containerName="manila-db-sync" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.602862 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.608225 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-xlj6f" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.608398 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.608470 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.608909 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.610373 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-config-data-custom\") pod \"manila-scheduler-0\" (UID: 
\"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.610463 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb39c204-29ae-4483-8d78-810d907515fe-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.610523 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.610747 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-config-data\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.610850 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg566\" (UniqueName: \"kubernetes.io/projected/cb39c204-29ae-4483-8d78-810d907515fe-kube-api-access-bg566\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.610986 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-scripts\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.643599 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.727485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.727885 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb39c204-29ae-4483-8d78-810d907515fe-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.727950 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb39c204-29ae-4483-8d78-810d907515fe-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.728379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.728968 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-config-data\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.729028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg566\" (UniqueName: \"kubernetes.io/projected/cb39c204-29ae-4483-8d78-810d907515fe-kube-api-access-bg566\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.729144 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-scripts\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.747526 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.750610 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.754723 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.756230 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-scripts\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.765127 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb39c204-29ae-4483-8d78-810d907515fe-config-data\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.774168 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg566\" (UniqueName: \"kubernetes.io/projected/cb39c204-29ae-4483-8d78-810d907515fe-kube-api-access-bg566\") pod \"manila-scheduler-0\" (UID: \"cb39c204-29ae-4483-8d78-810d907515fe\") " pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.785754 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.796036 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c489d6fcc-6jc4s"]
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.799958 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.825690 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843463 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-sb\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843521 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/945cdf25-57ce-44f2-89ae-2a17e21c485f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843562 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6dqc\" (UniqueName: \"kubernetes.io/projected/80a123a6-e2f6-4491-abbb-0eeb468e2879-kube-api-access-f6dqc\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843639 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843674 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-nb\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843742 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-scripts\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-config\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843827 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843894 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-dns-svc\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843951 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmvb\" (UniqueName: \"kubernetes.io/projected/945cdf25-57ce-44f2-89ae-2a17e21c485f-kube-api-access-bsmvb\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.843968 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-config-data\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.844008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/945cdf25-57ce-44f2-89ae-2a17e21c485f-ceph\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.844058 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/945cdf25-57ce-44f2-89ae-2a17e21c485f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.900584 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c489d6fcc-6jc4s"]
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.919856 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.928521 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.932910 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.948304 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/945cdf25-57ce-44f2-89ae-2a17e21c485f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.964013 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-sb\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.964261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/945cdf25-57ce-44f2-89ae-2a17e21c485f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.950432 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/945cdf25-57ce-44f2-89ae-2a17e21c485f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.964506 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6dqc\" (UniqueName: \"kubernetes.io/projected/80a123a6-e2f6-4491-abbb-0eeb468e2879-kube-api-access-f6dqc\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.964608 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/945cdf25-57ce-44f2-89ae-2a17e21c485f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.964757 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.964913 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-nb\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.965089 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-scripts\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.965899 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-config\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.966012 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.966168 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-dns-svc\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.966318 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmvb\" (UniqueName: \"kubernetes.io/projected/945cdf25-57ce-44f2-89ae-2a17e21c485f-kube-api-access-bsmvb\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.966398 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-config-data\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.965326 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-sb\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.969709 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-dns-svc\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.970041 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-scripts\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.970070 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-config\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.972469 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-config-data\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.973343 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.976215 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/945cdf25-57ce-44f2-89ae-2a17e21c485f-ceph\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.976373 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-nb\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.988348 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/945cdf25-57ce-44f2-89ae-2a17e21c485f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.988613 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/945cdf25-57ce-44f2-89ae-2a17e21c485f-ceph\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.989542 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.993289 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.996183 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Oct 13 14:42:03 crc kubenswrapper[4797]: I1013 14:42:03.997369 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6dqc\" (UniqueName: \"kubernetes.io/projected/80a123a6-e2f6-4491-abbb-0eeb468e2879-kube-api-access-f6dqc\") pod \"dnsmasq-dns-6c489d6fcc-6jc4s\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.004119 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.005044 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmvb\" (UniqueName: \"kubernetes.io/projected/945cdf25-57ce-44f2-89ae-2a17e21c485f-kube-api-access-bsmvb\") pod \"manila-share-share1-0\" (UID: \"945cdf25-57ce-44f2-89ae-2a17e21c485f\") " pod="openstack/manila-share-share1-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.081629 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-scripts\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.081715 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjrt\" (UniqueName: \"kubernetes.io/projected/58623ce0-7ada-4657-954a-d91efd56bb3e-kube-api-access-brjrt\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.081756 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-config-data\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.097334 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-config-data-custom\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.097715 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58623ce0-7ada-4657-954a-d91efd56bb3e-logs\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.097798 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.097921 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58623ce0-7ada-4657-954a-d91efd56bb3e-etc-machine-id\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.165861 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.179315 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-config-data-custom\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199520 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58623ce0-7ada-4657-954a-d91efd56bb3e-logs\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199563 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58623ce0-7ada-4657-954a-d91efd56bb3e-etc-machine-id\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199650 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-scripts\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjrt\" (UniqueName: \"kubernetes.io/projected/58623ce0-7ada-4657-954a-d91efd56bb3e-kube-api-access-brjrt\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199728 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-config-data\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.199774 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58623ce0-7ada-4657-954a-d91efd56bb3e-etc-machine-id\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.200518 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58623ce0-7ada-4657-954a-d91efd56bb3e-logs\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.205086 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-scripts\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.205576 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-config-data-custom\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.206798 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.225391 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjrt\" (UniqueName: \"kubernetes.io/projected/58623ce0-7ada-4657-954a-d91efd56bb3e-kube-api-access-brjrt\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.244973 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58623ce0-7ada-4657-954a-d91efd56bb3e-config-data\") pod \"manila-api-0\" (UID: \"58623ce0-7ada-4657-954a-d91efd56bb3e\") " pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.480246 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.662697 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 13 14:42:04 crc kubenswrapper[4797]: I1013 14:42:04.952198 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c489d6fcc-6jc4s"]
Oct 13 14:42:05 crc kubenswrapper[4797]: W1013 14:42:05.019250 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80a123a6_e2f6_4491_abbb_0eeb468e2879.slice/crio-19cae0882b3e126c057cb7f36200511ed25e3226c16a371c9cc409ceeca4a946 WatchSource:0}: Error finding container 19cae0882b3e126c057cb7f36200511ed25e3226c16a371c9cc409ceeca4a946: Status 404 returned error can't find the container with id 19cae0882b3e126c057cb7f36200511ed25e3226c16a371c9cc409ceeca4a946
Oct 13 14:42:05 crc kubenswrapper[4797]: I1013 14:42:05.115174 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Oct 13 14:42:05 crc kubenswrapper[4797]: W1013 14:42:05.132052 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945cdf25_57ce_44f2_89ae_2a17e21c485f.slice/crio-ec0db50348a06f02799d6c8eafeecc7aaf256c0af270c9e316995cf19482ee35 WatchSource:0}: Error finding container ec0db50348a06f02799d6c8eafeecc7aaf256c0af270c9e316995cf19482ee35: Status 404 returned error can't find the container with id ec0db50348a06f02799d6c8eafeecc7aaf256c0af270c9e316995cf19482ee35
Oct 13 14:42:05 crc kubenswrapper[4797]: W1013 14:42:05.350745 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58623ce0_7ada_4657_954a_d91efd56bb3e.slice/crio-59ddc21af9647c49b5d7cd73a91a4525f1094d3a770ac5cce1b8250758af1c6a WatchSource:0}: Error finding container 59ddc21af9647c49b5d7cd73a91a4525f1094d3a770ac5cce1b8250758af1c6a: Status 404 returned error can't find the container with id 59ddc21af9647c49b5d7cd73a91a4525f1094d3a770ac5cce1b8250758af1c6a
Oct 13 14:42:05 crc kubenswrapper[4797]: I1013 14:42:05.350836 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 13 14:42:05 crc kubenswrapper[4797]: I1013 14:42:05.360408 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cb39c204-29ae-4483-8d78-810d907515fe","Type":"ContainerStarted","Data":"7c89db0f5797ba6d96c9b85b9fc66002e1a8757d24757372a956e9d9fc80ec42"}
Oct 13 14:42:05 crc kubenswrapper[4797]: I1013 14:42:05.361592 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"945cdf25-57ce-44f2-89ae-2a17e21c485f","Type":"ContainerStarted","Data":"ec0db50348a06f02799d6c8eafeecc7aaf256c0af270c9e316995cf19482ee35"}
Oct 13 14:42:05 crc kubenswrapper[4797]: I1013 14:42:05.362488 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" event={"ID":"80a123a6-e2f6-4491-abbb-0eeb468e2879","Type":"ContainerStarted","Data":"19cae0882b3e126c057cb7f36200511ed25e3226c16a371c9cc409ceeca4a946"}
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.042397 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-sbxhd"]
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.063867 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-s9hws"]
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.079517 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-sbxhd"]
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.095919 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fwp9r"]
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.103031 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-s9hws"]
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.113198 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fwp9r"]
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.401740 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cb39c204-29ae-4483-8d78-810d907515fe","Type":"ContainerStarted","Data":"741c03a1ad2d194eb932b652e5de6c6c45f1adbc07b363df4734aec8cd1d0255"}
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.401794 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cb39c204-29ae-4483-8d78-810d907515fe","Type":"ContainerStarted","Data":"c586f225968c3628539249f2089bfaead36373cc8e308009c75baf82ac11b72a"}
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.405197 4797 generic.go:334] "Generic (PLEG): container finished" podID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerID="e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503" exitCode=0
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.405271 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" event={"ID":"80a123a6-e2f6-4491-abbb-0eeb468e2879","Type":"ContainerDied","Data":"e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503"}
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.410750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58623ce0-7ada-4657-954a-d91efd56bb3e","Type":"ContainerStarted","Data":"7de7fa62eeb531fc229acedf87aa58ab8fcc4ec7af5540190e5894cd0da70c27"}
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.411662 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58623ce0-7ada-4657-954a-d91efd56bb3e","Type":"ContainerStarted","Data":"59ddc21af9647c49b5d7cd73a91a4525f1094d3a770ac5cce1b8250758af1c6a"}
Oct 13 14:42:06 crc kubenswrapper[4797]: I1013 14:42:06.434537 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.085111659 podStartE2EDuration="3.434519975s" podCreationTimestamp="2025-10-13 14:42:03 +0000 UTC" firstStartedPulling="2025-10-13 14:42:04.720925945 +0000 UTC m=+5702.254476201" lastFinishedPulling="2025-10-13 14:42:05.070334261 +0000 UTC m=+5702.603884517" observedRunningTime="2025-10-13 14:42:06.428311563 +0000 UTC m=+5703.961861829" watchObservedRunningTime="2025-10-13 14:42:06.434519975 +0000 UTC m=+5703.968070231"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.249722 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb03bfd-b126-482a-be5f-abb05c56d932" path="/var/lib/kubelet/pods/7bb03bfd-b126-482a-be5f-abb05c56d932/volumes"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.253783 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9068568-7dc0-4636-8129-794e57dd32e3" path="/var/lib/kubelet/pods/d9068568-7dc0-4636-8129-794e57dd32e3/volumes"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.254333 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c66927-df80-4626-a9a4-5af6596fac42" path="/var/lib/kubelet/pods/e8c66927-df80-4626-a9a4-5af6596fac42/volumes"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.432990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" event={"ID":"80a123a6-e2f6-4491-abbb-0eeb468e2879","Type":"ContainerStarted","Data":"1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093"}
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.433205 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.436266 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58623ce0-7ada-4657-954a-d91efd56bb3e","Type":"ContainerStarted","Data":"7ebdb45bdc00804f2f16e11d933df33c33a03bdaf5fb684689af4a5593713953"}
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.436393 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.461777 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" podStartSLOduration=4.461750579 podStartE2EDuration="4.461750579s" podCreationTimestamp="2025-10-13 14:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:42:07.449774166 +0000 UTC m=+5704.983324442" watchObservedRunningTime="2025-10-13 14:42:07.461750579 +0000 UTC m=+5704.995300835"
Oct 13 14:42:07 crc kubenswrapper[4797]: I1013 14:42:07.479229 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.479209737 podStartE2EDuration="4.479209737s" podCreationTimestamp="2025-10-13 14:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:42:07.470951014 +0000 UTC m=+5705.004501280" watchObservedRunningTime="2025-10-13 14:42:07.479209737 +0000 UTC m=+5705.012759993"
Oct 13 14:42:09 crc kubenswrapper[4797]: E1013 14:42:09.563549 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193\": RecentStats: unable to find data in memory cache]" Oct 13 14:42:12 crc kubenswrapper[4797]: I1013 14:42:12.236680 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:42:12 crc kubenswrapper[4797]: E1013 14:42:12.237450 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:42:13 crc kubenswrapper[4797]: I1013 14:42:13.510624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"945cdf25-57ce-44f2-89ae-2a17e21c485f","Type":"ContainerStarted","Data":"12d637f994b58959fe4165fa81c4751873011bc4e50c072b684518a31069cf59"} Oct 13 14:42:13 crc kubenswrapper[4797]: I1013 14:42:13.929190 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.180956 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.272907 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-564c9b84c5-8nbpf"] Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.273257 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="dnsmasq-dns" 
containerID="cri-o://cc36507ded7605b7ba59eaca28d45815d99f6ef2bb7b0b08fd3fba0faf592300" gracePeriod=10 Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.426306 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.79:5353: connect: connection refused" Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.538095 4797 generic.go:334] "Generic (PLEG): container finished" podID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerID="cc36507ded7605b7ba59eaca28d45815d99f6ef2bb7b0b08fd3fba0faf592300" exitCode=0 Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.538144 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" event={"ID":"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4","Type":"ContainerDied","Data":"cc36507ded7605b7ba59eaca28d45815d99f6ef2bb7b0b08fd3fba0faf592300"} Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.541879 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"945cdf25-57ce-44f2-89ae-2a17e21c485f","Type":"ContainerStarted","Data":"b19253a4b5e654ca5264b4e1999c3362e69e9cb46cba87bc0d1e83356279712d"} Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.561497 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.043773324 podStartE2EDuration="11.561480471s" podCreationTimestamp="2025-10-13 14:42:03 +0000 UTC" firstStartedPulling="2025-10-13 14:42:05.140632132 +0000 UTC m=+5702.674182388" lastFinishedPulling="2025-10-13 14:42:12.658339289 +0000 UTC m=+5710.191889535" observedRunningTime="2025-10-13 14:42:14.560573629 +0000 UTC m=+5712.094123915" watchObservedRunningTime="2025-10-13 14:42:14.561480471 +0000 UTC m=+5712.095030727" Oct 13 14:42:14 crc kubenswrapper[4797]: I1013 14:42:14.894070 4797 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.054598 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-dns-svc\") pod \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.054720 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-nb\") pod \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.054797 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-sb\") pod \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.054834 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27g8d\" (UniqueName: \"kubernetes.io/projected/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-kube-api-access-27g8d\") pod \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.054862 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-config\") pod \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\" (UID: \"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4\") " Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.069000 4797 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-kube-api-access-27g8d" (OuterVolumeSpecName: "kube-api-access-27g8d") pod "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" (UID: "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4"). InnerVolumeSpecName "kube-api-access-27g8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.113844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" (UID: "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.116220 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" (UID: "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.130567 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-config" (OuterVolumeSpecName: "config") pod "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" (UID: "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.133282 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" (UID: "1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.156567 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.156606 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.156617 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.156628 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27g8d\" (UniqueName: \"kubernetes.io/projected/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-kube-api-access-27g8d\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.156638 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.553131 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.553113 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564c9b84c5-8nbpf" event={"ID":"1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4","Type":"ContainerDied","Data":"5643c3982e32054ba5e7f400d81661af08d9e3c56fbd97019c4023f50d4edf77"} Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.553206 4797 scope.go:117] "RemoveContainer" containerID="cc36507ded7605b7ba59eaca28d45815d99f6ef2bb7b0b08fd3fba0faf592300" Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.583380 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-564c9b84c5-8nbpf"] Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.594156 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-564c9b84c5-8nbpf"] Oct 13 14:42:15 crc kubenswrapper[4797]: I1013 14:42:15.595076 4797 scope.go:117] "RemoveContainer" containerID="6657962aea5639db472c8891f2f798191b6c24df7414d3b4f2809ca8c8b4d3d7" Oct 13 14:42:16 crc kubenswrapper[4797]: I1013 14:42:16.034087 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ea2d-account-create-5459z"] Oct 13 14:42:16 crc kubenswrapper[4797]: I1013 14:42:16.043559 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cc58-account-create-4c2kv"] Oct 13 14:42:16 crc kubenswrapper[4797]: I1013 14:42:16.054365 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ea2d-account-create-5459z"] Oct 13 14:42:16 crc kubenswrapper[4797]: I1013 14:42:16.061870 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cc58-account-create-4c2kv"] Oct 13 14:42:16 crc kubenswrapper[4797]: I1013 14:42:16.069351 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-64d8-account-create-c74vt"] Oct 13 14:42:16 crc kubenswrapper[4797]: I1013 14:42:16.077083 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-64d8-account-create-c74vt"] Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.073775 4797 scope.go:117] "RemoveContainer" containerID="9a001682c8f110ed0261d9b3fc9b75b4734e268d044145379dca9699293626a0" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.080693 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.080966 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-central-agent" containerID="cri-o://7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3" gracePeriod=30 Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.081092 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="proxy-httpd" containerID="cri-o://9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11" gracePeriod=30 Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.081144 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="sg-core" containerID="cri-o://41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f" gracePeriod=30 Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.081185 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-notification-agent" containerID="cri-o://dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3" gracePeriod=30 Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.118725 4797 scope.go:117] "RemoveContainer" 
containerID="5434d1dc1c5f053e87774f21a075c3701e704bba8a186e986c848735d7eea498" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.175406 4797 scope.go:117] "RemoveContainer" containerID="f5468adde202ce08735f6bcd3dd3317db4fbf8363100efc31a0af21200d6ce95" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.256524 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce3d55d-49d8-4559-bb4a-aab76084a7d3" path="/var/lib/kubelet/pods/0ce3d55d-49d8-4559-bb4a-aab76084a7d3/volumes" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.257294 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" path="/var/lib/kubelet/pods/1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4/volumes" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.257875 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d225d04-4356-48b7-acd1-beb811cb4320" path="/var/lib/kubelet/pods/9d225d04-4356-48b7-acd1-beb811cb4320/volumes" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.258392 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4981c2c-7ef7-468d-808c-c1dba459c1c0" path="/var/lib/kubelet/pods/e4981c2c-7ef7-468d-808c-c1dba459c1c0/volumes" Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.575973 4797 generic.go:334] "Generic (PLEG): container finished" podID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerID="9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11" exitCode=0 Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.576019 4797 generic.go:334] "Generic (PLEG): container finished" podID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerID="41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f" exitCode=2 Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.576052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerDied","Data":"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11"} Oct 13 14:42:17 crc kubenswrapper[4797]: I1013 14:42:17.576105 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerDied","Data":"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f"} Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.451788 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559061 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-scripts\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559403 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-combined-ca-bundle\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559429 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-log-httpd\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559459 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-config-data\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: 
\"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559531 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-sg-core-conf-yaml\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559568 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-run-httpd\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.559610 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99p9s\" (UniqueName: \"kubernetes.io/projected/7b4e14a9-e972-4fc1-815c-f5dfa5399912-kube-api-access-99p9s\") pod \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\" (UID: \"7b4e14a9-e972-4fc1-815c-f5dfa5399912\") " Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.560058 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.560204 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.560279 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.567989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-scripts" (OuterVolumeSpecName: "scripts") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.568041 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4e14a9-e972-4fc1-815c-f5dfa5399912-kube-api-access-99p9s" (OuterVolumeSpecName: "kube-api-access-99p9s") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). InnerVolumeSpecName "kube-api-access-99p9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.598885 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601516 4797 generic.go:334] "Generic (PLEG): container finished" podID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerID="dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3" exitCode=0 Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601542 4797 generic.go:334] "Generic (PLEG): container finished" podID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerID="7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3" exitCode=0 Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerDied","Data":"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3"} Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601586 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerDied","Data":"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3"} Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601596 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b4e14a9-e972-4fc1-815c-f5dfa5399912","Type":"ContainerDied","Data":"469bb3cb24aa866882095d4b37207a724d4e2d94955117c7785ec1e63a166ba7"} Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601610 4797 scope.go:117] "RemoveContainer" containerID="9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.601756 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.661624 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.661652 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b4e14a9-e972-4fc1-815c-f5dfa5399912-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.661661 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.661670 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99p9s\" (UniqueName: \"kubernetes.io/projected/7b4e14a9-e972-4fc1-815c-f5dfa5399912-kube-api-access-99p9s\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.663950 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.675588 4797 scope.go:117] "RemoveContainer" containerID="41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.682752 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-config-data" (OuterVolumeSpecName: "config-data") pod "7b4e14a9-e972-4fc1-815c-f5dfa5399912" (UID: "7b4e14a9-e972-4fc1-815c-f5dfa5399912"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.694068 4797 scope.go:117] "RemoveContainer" containerID="dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.728054 4797 scope.go:117] "RemoveContainer" containerID="7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.757040 4797 scope.go:117] "RemoveContainer" containerID="9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.757526 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11\": container with ID starting with 9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11 not found: ID does not exist" containerID="9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.757583 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11"} err="failed to get container status \"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11\": 
rpc error: code = NotFound desc = could not find container \"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11\": container with ID starting with 9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11 not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.757642 4797 scope.go:117] "RemoveContainer" containerID="41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.757992 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f\": container with ID starting with 41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f not found: ID does not exist" containerID="41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758012 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f"} err="failed to get container status \"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f\": rpc error: code = NotFound desc = could not find container \"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f\": container with ID starting with 41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758024 4797 scope.go:117] "RemoveContainer" containerID="dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.758397 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3\": container with ID starting with 
dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3 not found: ID does not exist" containerID="dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758419 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3"} err="failed to get container status \"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3\": rpc error: code = NotFound desc = could not find container \"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3\": container with ID starting with dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3 not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758431 4797 scope.go:117] "RemoveContainer" containerID="7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.758601 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3\": container with ID starting with 7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3 not found: ID does not exist" containerID="7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758622 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3"} err="failed to get container status \"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3\": rpc error: code = NotFound desc = could not find container \"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3\": container with ID starting with 7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3 not found: ID does not 
exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758637 4797 scope.go:117] "RemoveContainer" containerID="9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758887 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11"} err="failed to get container status \"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11\": rpc error: code = NotFound desc = could not find container \"9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11\": container with ID starting with 9e92322d9b7ddc1983ccf0f0f51539223a7e72cb9ee523bbbc9689687edeef11 not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.758907 4797 scope.go:117] "RemoveContainer" containerID="41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.759275 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f"} err="failed to get container status \"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f\": rpc error: code = NotFound desc = could not find container \"41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f\": container with ID starting with 41981fb5e99c02149e9791c65bb5b930500f07a20f87bd85093779ccd7c5013f not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.759330 4797 scope.go:117] "RemoveContainer" containerID="dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.759767 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3"} err="failed to get container status 
\"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3\": rpc error: code = NotFound desc = could not find container \"dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3\": container with ID starting with dee06dcbd97929d4f49ea7969cfe7b1bb5638ffe3ed0258eb294127a4c7807f3 not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.759789 4797 scope.go:117] "RemoveContainer" containerID="7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.760067 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3"} err="failed to get container status \"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3\": rpc error: code = NotFound desc = could not find container \"7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3\": container with ID starting with 7d97cffa929237e724d4f415e4bafc0cb19a158c176686afb77a5c13bd1814f3 not found: ID does not exist" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.762889 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.762912 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4e14a9-e972-4fc1-815c-f5dfa5399912-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.942728 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.952106 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964078 4797 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.964601 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-notification-agent" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964627 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-notification-agent" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.964645 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="init" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964654 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="init" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.964674 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="proxy-httpd" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964681 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="proxy-httpd" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.964703 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="sg-core" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964713 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="sg-core" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.964726 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="dnsmasq-dns" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964731 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="dnsmasq-dns" Oct 13 14:42:18 crc kubenswrapper[4797]: E1013 14:42:18.964752 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-central-agent" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.964759 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-central-agent" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.965032 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-notification-agent" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.965056 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="sg-core" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.965077 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="proxy-httpd" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.965088 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" containerName="ceilometer-central-agent" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.965102 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e42222d-b6e6-4ef7-a8fa-4f3b2a88c0c4" containerName="dnsmasq-dns" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.970008 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.973547 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.975142 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:42:18 crc kubenswrapper[4797]: I1013 14:42:18.977218 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.067766 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.067844 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/713cca7f-3dd9-4fde-8672-c566a1acd8ce-run-httpd\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.067865 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-scripts\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.067884 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " 
pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.067949 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/713cca7f-3dd9-4fde-8672-c566a1acd8ce-log-httpd\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.067965 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nk22\" (UniqueName: \"kubernetes.io/projected/713cca7f-3dd9-4fde-8672-c566a1acd8ce-kube-api-access-9nk22\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.068006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-config-data\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169468 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169561 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/713cca7f-3dd9-4fde-8672-c566a1acd8ce-run-httpd\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169593 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-scripts\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169716 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/713cca7f-3dd9-4fde-8672-c566a1acd8ce-log-httpd\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169733 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nk22\" (UniqueName: \"kubernetes.io/projected/713cca7f-3dd9-4fde-8672-c566a1acd8ce-kube-api-access-9nk22\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.169777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-config-data\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.171129 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/713cca7f-3dd9-4fde-8672-c566a1acd8ce-log-httpd\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc 
kubenswrapper[4797]: I1013 14:42:19.171296 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/713cca7f-3dd9-4fde-8672-c566a1acd8ce-run-httpd\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.173575 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-scripts\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.173832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.174499 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-config-data\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.174609 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713cca7f-3dd9-4fde-8672-c566a1acd8ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.195747 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nk22\" (UniqueName: \"kubernetes.io/projected/713cca7f-3dd9-4fde-8672-c566a1acd8ce-kube-api-access-9nk22\") pod \"ceilometer-0\" (UID: 
\"713cca7f-3dd9-4fde-8672-c566a1acd8ce\") " pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.249224 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4e14a9-e972-4fc1-815c-f5dfa5399912" path="/var/lib/kubelet/pods/7b4e14a9-e972-4fc1-815c-f5dfa5399912/volumes" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.286845 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 14:42:19 crc kubenswrapper[4797]: I1013 14:42:19.669671 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 14:42:19 crc kubenswrapper[4797]: E1013 14:42:19.840081 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193\": RecentStats: unable to find data in memory cache]" Oct 13 14:42:20 crc kubenswrapper[4797]: I1013 14:42:20.631073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"713cca7f-3dd9-4fde-8672-c566a1acd8ce","Type":"ContainerStarted","Data":"e051e479a7f852ab66fc01ec6c810fd99d49b90b0d558c3fcaebb3fd44686a6c"} Oct 13 14:42:20 crc kubenswrapper[4797]: I1013 14:42:20.631374 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"713cca7f-3dd9-4fde-8672-c566a1acd8ce","Type":"ContainerStarted","Data":"292390cc2f8ff1f7fbbb4f21fe8a38476cbd95f991ef00fb2f9949708dfdfda4"} Oct 13 14:42:21 crc kubenswrapper[4797]: I1013 14:42:21.645021 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"713cca7f-3dd9-4fde-8672-c566a1acd8ce","Type":"ContainerStarted","Data":"61e53d816648d3acf2a19d545793789ad54972e0d59095f7e3ab72a476cab9ee"} Oct 13 14:42:22 crc kubenswrapper[4797]: I1013 14:42:22.658130 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"713cca7f-3dd9-4fde-8672-c566a1acd8ce","Type":"ContainerStarted","Data":"a0ba26c89b8da8efa336ca598292073e843cb2cfb3ee5eb3ab83728b0fe3036a"} Oct 13 14:42:23 crc kubenswrapper[4797]: I1013 14:42:23.671180 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"713cca7f-3dd9-4fde-8672-c566a1acd8ce","Type":"ContainerStarted","Data":"6ab3ef16d21f7e920b835736c8491d33b9d0059fd7f19879c51b14887e2ce3bf"} Oct 13 14:42:23 crc kubenswrapper[4797]: I1013 14:42:23.672661 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 14:42:23 crc kubenswrapper[4797]: I1013 14:42:23.703108 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6705546780000002 podStartE2EDuration="5.703083775s" podCreationTimestamp="2025-10-13 14:42:18 +0000 UTC" firstStartedPulling="2025-10-13 14:42:19.664721697 +0000 UTC m=+5717.198271953" lastFinishedPulling="2025-10-13 14:42:22.697250794 +0000 UTC m=+5720.230801050" observedRunningTime="2025-10-13 14:42:23.697318933 +0000 UTC m=+5721.230869269" watchObservedRunningTime="2025-10-13 14:42:23.703083775 +0000 UTC m=+5721.236634031" Oct 13 14:42:24 crc kubenswrapper[4797]: I1013 14:42:24.166821 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 13 14:42:25 crc kubenswrapper[4797]: I1013 14:42:25.548863 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 13 14:42:25 crc kubenswrapper[4797]: I1013 14:42:25.846535 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/manila-share-share1-0" Oct 13 14:42:25 crc kubenswrapper[4797]: I1013 14:42:25.853375 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 13 14:42:26 crc kubenswrapper[4797]: I1013 14:42:26.237507 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:42:26 crc kubenswrapper[4797]: E1013 14:42:26.241202 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:42:30 crc kubenswrapper[4797]: E1013 14:42:30.141132 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193\": RecentStats: unable to find data in memory cache]" Oct 13 14:42:35 crc kubenswrapper[4797]: I1013 14:42:35.042501 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqjqh"] Oct 13 14:42:35 crc kubenswrapper[4797]: I1013 14:42:35.053005 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqjqh"] Oct 13 14:42:35 crc kubenswrapper[4797]: I1013 14:42:35.250249 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc41bde-060e-4a36-adb7-62b14fd7cd30" 
path="/var/lib/kubelet/pods/bcc41bde-060e-4a36-adb7-62b14fd7cd30/volumes" Oct 13 14:42:38 crc kubenswrapper[4797]: I1013 14:42:38.236992 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:42:38 crc kubenswrapper[4797]: E1013 14:42:38.237646 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:42:40 crc kubenswrapper[4797]: E1013 14:42:40.426398 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193\": RecentStats: unable to find data in memory cache]" Oct 13 14:42:49 crc kubenswrapper[4797]: I1013 14:42:49.292949 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 14:42:50 crc kubenswrapper[4797]: E1013 14:42:50.764199 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:42:51 crc kubenswrapper[4797]: I1013 14:42:51.236338 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:42:51 crc kubenswrapper[4797]: E1013 14:42:51.236936 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:42:55 crc kubenswrapper[4797]: I1013 14:42:55.047849 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkk9c"] Oct 13 14:42:55 crc kubenswrapper[4797]: I1013 14:42:55.069663 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wsbtp"] Oct 13 14:42:55 crc kubenswrapper[4797]: I1013 14:42:55.078748 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkk9c"] Oct 13 14:42:55 crc kubenswrapper[4797]: I1013 14:42:55.088472 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wsbtp"] Oct 13 14:42:55 crc kubenswrapper[4797]: I1013 14:42:55.249372 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e83dc9a-ad8e-480e-b41e-f140d7ccacb5" path="/var/lib/kubelet/pods/8e83dc9a-ad8e-480e-b41e-f140d7ccacb5/volumes" Oct 13 14:42:55 crc kubenswrapper[4797]: I1013 14:42:55.250401 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e14a029e-0f34-443e-bf02-4ffc65c90307" 
path="/var/lib/kubelet/pods/e14a029e-0f34-443e-bf02-4ffc65c90307/volumes" Oct 13 14:43:01 crc kubenswrapper[4797]: E1013 14:43:01.019728 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5348b8f_6498_4d16_bde9_20705e21127d.slice/crio-f7ed0f98af3d3897ec6e92084ca75948c6498b0699b0dad32a69c70b2b9b2193\": RecentStats: unable to find data in memory cache]" Oct 13 14:43:06 crc kubenswrapper[4797]: I1013 14:43:06.236501 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:43:06 crc kubenswrapper[4797]: E1013 14:43:06.239561 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:43:10 crc kubenswrapper[4797]: I1013 14:43:10.908518 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b76c56699-bpj22"] Oct 13 14:43:10 crc kubenswrapper[4797]: I1013 14:43:10.911992 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:10 crc kubenswrapper[4797]: I1013 14:43:10.913999 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 13 14:43:10 crc kubenswrapper[4797]: I1013 14:43:10.932937 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b76c56699-bpj22"] Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.007836 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-dns-svc\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.007968 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-config\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.007988 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.008025 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rw8\" (UniqueName: \"kubernetes.io/projected/90aa9215-2ccf-4400-b400-26d0d8e1bba8-kube-api-access-z2rw8\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " 
pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.008072 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.008099 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-openstack-cell1\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.109333 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-config\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.109384 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.109436 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rw8\" (UniqueName: \"kubernetes.io/projected/90aa9215-2ccf-4400-b400-26d0d8e1bba8-kube-api-access-z2rw8\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " 
pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.109495 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.109531 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-openstack-cell1\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.109588 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-dns-svc\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.110287 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-config\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.110670 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 
14:43:11.112875 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.113131 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-openstack-cell1\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.113175 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-dns-svc\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.129711 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rw8\" (UniqueName: \"kubernetes.io/projected/90aa9215-2ccf-4400-b400-26d0d8e1bba8-kube-api-access-z2rw8\") pod \"dnsmasq-dns-6b76c56699-bpj22\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.230676 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:11 crc kubenswrapper[4797]: I1013 14:43:11.804534 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b76c56699-bpj22"] Oct 13 14:43:11 crc kubenswrapper[4797]: W1013 14:43:11.804558 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90aa9215_2ccf_4400_b400_26d0d8e1bba8.slice/crio-97229223ed27c7f8fe400f2db56bd7b58e98c06cbb3c2674240f3a722a7e60a5 WatchSource:0}: Error finding container 97229223ed27c7f8fe400f2db56bd7b58e98c06cbb3c2674240f3a722a7e60a5: Status 404 returned error can't find the container with id 97229223ed27c7f8fe400f2db56bd7b58e98c06cbb3c2674240f3a722a7e60a5 Oct 13 14:43:12 crc kubenswrapper[4797]: I1013 14:43:12.176081 4797 generic.go:334] "Generic (PLEG): container finished" podID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerID="990429f1b2e468d798484f93bf4cc2185b849d3118b7e9458e9e8e31da5e1e77" exitCode=0 Oct 13 14:43:12 crc kubenswrapper[4797]: I1013 14:43:12.176209 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" event={"ID":"90aa9215-2ccf-4400-b400-26d0d8e1bba8","Type":"ContainerDied","Data":"990429f1b2e468d798484f93bf4cc2185b849d3118b7e9458e9e8e31da5e1e77"} Oct 13 14:43:12 crc kubenswrapper[4797]: I1013 14:43:12.176347 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" event={"ID":"90aa9215-2ccf-4400-b400-26d0d8e1bba8","Type":"ContainerStarted","Data":"97229223ed27c7f8fe400f2db56bd7b58e98c06cbb3c2674240f3a722a7e60a5"} Oct 13 14:43:13 crc kubenswrapper[4797]: I1013 14:43:13.196174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" event={"ID":"90aa9215-2ccf-4400-b400-26d0d8e1bba8","Type":"ContainerStarted","Data":"a672498f2c5b67a50102b8d27065feb7431c336470d6b617401bc84102e0c62e"} Oct 13 14:43:13 crc 
kubenswrapper[4797]: I1013 14:43:13.196707 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:13 crc kubenswrapper[4797]: I1013 14:43:13.246494 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" podStartSLOduration=3.24647468 podStartE2EDuration="3.24647468s" podCreationTimestamp="2025-10-13 14:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:43:13.229293568 +0000 UTC m=+5770.762843834" watchObservedRunningTime="2025-10-13 14:43:13.24647468 +0000 UTC m=+5770.780024936" Oct 13 14:43:14 crc kubenswrapper[4797]: I1013 14:43:14.041691 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wjhx2"] Oct 13 14:43:14 crc kubenswrapper[4797]: I1013 14:43:14.054499 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wjhx2"] Oct 13 14:43:15 crc kubenswrapper[4797]: I1013 14:43:15.251148 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768cb00c-712a-4da6-bcc9-669b541c1761" path="/var/lib/kubelet/pods/768cb00c-712a-4da6-bcc9-669b541c1761/volumes" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.452624 4797 scope.go:117] "RemoveContainer" containerID="4f2548644b61d387f76ad56cf7cd644c9e497ce36b28bfcc828631f67644df02" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.481599 4797 scope.go:117] "RemoveContainer" containerID="760fd1ac730f931272315a4082868d802b7d7dfc344e4b48b04f26e08d8abef2" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.554587 4797 scope.go:117] "RemoveContainer" containerID="11b52077a25ff4ef47422e07552263ace81d9cb6b39f17b32d0922f6ca0b32eb" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.602034 4797 scope.go:117] "RemoveContainer" 
containerID="0afc7e79af2a4e281ae15fb5c878a5a7160d75851fbd1e5d29fd13c3873f98c0" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.652231 4797 scope.go:117] "RemoveContainer" containerID="af5dbd20310a5fd1b74d30ecd1df10ae2d3b3c2187d9d485aaa6577802d3df22" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.676438 4797 scope.go:117] "RemoveContainer" containerID="e86e99da707a5e06b77ff5fd97dda0e3f7b432d2d862555c57df10db6517018a" Oct 13 14:43:17 crc kubenswrapper[4797]: I1013 14:43:17.751210 4797 scope.go:117] "RemoveContainer" containerID="f75f53b05a896d1bfa0f2f4c2a81ea202396bf1a4892f8008657de104104896d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.232100 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.236690 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:43:21 crc kubenswrapper[4797]: E1013 14:43:21.237276 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.321113 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c489d6fcc-6jc4s"] Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.321449 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerName="dnsmasq-dns" containerID="cri-o://1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093" gracePeriod=10 Oct 13 
14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.439872 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f55445bd9-6dr9d"] Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.442010 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.473726 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f55445bd9-6dr9d"] Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.557730 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.557769 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-config\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.558112 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6ct\" (UniqueName: \"kubernetes.io/projected/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-kube-api-access-xm6ct\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.558427 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-dns-svc\") pod 
\"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.558464 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.558703 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-openstack-cell1\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.661951 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6ct\" (UniqueName: \"kubernetes.io/projected/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-kube-api-access-xm6ct\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.662615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-dns-svc\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.662779 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.662998 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-openstack-cell1\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.663185 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.663277 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-config\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.663514 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.663525 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-dns-svc\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " 
pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.664143 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-config\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.664250 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.664997 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-openstack-cell1\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.688396 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6ct\" (UniqueName: \"kubernetes.io/projected/9f72e1e8-e5a1-4c86-9ef7-386e430a851c-kube-api-access-xm6ct\") pod \"dnsmasq-dns-6f55445bd9-6dr9d\" (UID: \"9f72e1e8-e5a1-4c86-9ef7-386e430a851c\") " pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.833767 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:21 crc kubenswrapper[4797]: I1013 14:43:21.955401 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.088735 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-config\") pod \"80a123a6-e2f6-4491-abbb-0eeb468e2879\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.091574 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6dqc\" (UniqueName: \"kubernetes.io/projected/80a123a6-e2f6-4491-abbb-0eeb468e2879-kube-api-access-f6dqc\") pod \"80a123a6-e2f6-4491-abbb-0eeb468e2879\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.091780 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-nb\") pod \"80a123a6-e2f6-4491-abbb-0eeb468e2879\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.091861 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-sb\") pod \"80a123a6-e2f6-4491-abbb-0eeb468e2879\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.091916 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-dns-svc\") pod \"80a123a6-e2f6-4491-abbb-0eeb468e2879\" (UID: \"80a123a6-e2f6-4491-abbb-0eeb468e2879\") " Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.117167 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/80a123a6-e2f6-4491-abbb-0eeb468e2879-kube-api-access-f6dqc" (OuterVolumeSpecName: "kube-api-access-f6dqc") pod "80a123a6-e2f6-4491-abbb-0eeb468e2879" (UID: "80a123a6-e2f6-4491-abbb-0eeb468e2879"). InnerVolumeSpecName "kube-api-access-f6dqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.159369 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "80a123a6-e2f6-4491-abbb-0eeb468e2879" (UID: "80a123a6-e2f6-4491-abbb-0eeb468e2879"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.170273 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-config" (OuterVolumeSpecName: "config") pod "80a123a6-e2f6-4491-abbb-0eeb468e2879" (UID: "80a123a6-e2f6-4491-abbb-0eeb468e2879"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.176315 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80a123a6-e2f6-4491-abbb-0eeb468e2879" (UID: "80a123a6-e2f6-4491-abbb-0eeb468e2879"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.188367 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "80a123a6-e2f6-4491-abbb-0eeb468e2879" (UID: "80a123a6-e2f6-4491-abbb-0eeb468e2879"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.197432 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.197468 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6dqc\" (UniqueName: \"kubernetes.io/projected/80a123a6-e2f6-4491-abbb-0eeb468e2879-kube-api-access-f6dqc\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.197483 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.197493 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.197502 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80a123a6-e2f6-4491-abbb-0eeb468e2879-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.300185 4797 generic.go:334] "Generic (PLEG): container finished" podID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerID="1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093" exitCode=0 Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.300231 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" event={"ID":"80a123a6-e2f6-4491-abbb-0eeb468e2879","Type":"ContainerDied","Data":"1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093"} Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 
14:43:22.300259 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" event={"ID":"80a123a6-e2f6-4491-abbb-0eeb468e2879","Type":"ContainerDied","Data":"19cae0882b3e126c057cb7f36200511ed25e3226c16a371c9cc409ceeca4a946"} Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.300277 4797 scope.go:117] "RemoveContainer" containerID="1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.300387 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c489d6fcc-6jc4s" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.328153 4797 scope.go:117] "RemoveContainer" containerID="e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.337038 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c489d6fcc-6jc4s"] Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.347526 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c489d6fcc-6jc4s"] Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.350768 4797 scope.go:117] "RemoveContainer" containerID="1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093" Oct 13 14:43:22 crc kubenswrapper[4797]: E1013 14:43:22.351335 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093\": container with ID starting with 1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093 not found: ID does not exist" containerID="1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.351382 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093"} err="failed to get container status \"1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093\": rpc error: code = NotFound desc = could not find container \"1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093\": container with ID starting with 1ae4681492128fd97a809246fd137701e256ccc00919562ec3be7d4472645093 not found: ID does not exist" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.351409 4797 scope.go:117] "RemoveContainer" containerID="e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503" Oct 13 14:43:22 crc kubenswrapper[4797]: E1013 14:43:22.351798 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503\": container with ID starting with e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503 not found: ID does not exist" containerID="e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.351851 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503"} err="failed to get container status \"e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503\": rpc error: code = NotFound desc = could not find container \"e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503\": container with ID starting with e988fdf012d700582d0d48bda9a2382512a7826eba026c0e6d732bd35889a503 not found: ID does not exist" Oct 13 14:43:22 crc kubenswrapper[4797]: I1013 14:43:22.357021 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f55445bd9-6dr9d"] Oct 13 14:43:23 crc kubenswrapper[4797]: I1013 14:43:23.266708 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" path="/var/lib/kubelet/pods/80a123a6-e2f6-4491-abbb-0eeb468e2879/volumes" Oct 13 14:43:23 crc kubenswrapper[4797]: I1013 14:43:23.314212 4797 generic.go:334] "Generic (PLEG): container finished" podID="9f72e1e8-e5a1-4c86-9ef7-386e430a851c" containerID="770a3567b8e7b7ac00f2cf20f7ce0da863d1415d207ab1598c515b15a3324baf" exitCode=0 Oct 13 14:43:23 crc kubenswrapper[4797]: I1013 14:43:23.314320 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" event={"ID":"9f72e1e8-e5a1-4c86-9ef7-386e430a851c","Type":"ContainerDied","Data":"770a3567b8e7b7ac00f2cf20f7ce0da863d1415d207ab1598c515b15a3324baf"} Oct 13 14:43:23 crc kubenswrapper[4797]: I1013 14:43:23.314412 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" event={"ID":"9f72e1e8-e5a1-4c86-9ef7-386e430a851c","Type":"ContainerStarted","Data":"96b3e2f48dcb8e297e08accbaa1a93302a7724d6c145db85a15564f0c58b2c69"} Oct 13 14:43:24 crc kubenswrapper[4797]: I1013 14:43:24.339532 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" event={"ID":"9f72e1e8-e5a1-4c86-9ef7-386e430a851c","Type":"ContainerStarted","Data":"a01054cfc42b7a35a060ef710be725d42e616113b9b37be46ceb6d2157fb3b7b"} Oct 13 14:43:24 crc kubenswrapper[4797]: I1013 14:43:24.340778 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:24 crc kubenswrapper[4797]: I1013 14:43:24.360228 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" podStartSLOduration=3.360200481 podStartE2EDuration="3.360200481s" podCreationTimestamp="2025-10-13 14:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:43:24.35851226 +0000 UTC m=+5781.892062586" 
watchObservedRunningTime="2025-10-13 14:43:24.360200481 +0000 UTC m=+5781.893750777" Oct 13 14:43:31 crc kubenswrapper[4797]: I1013 14:43:31.836059 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f55445bd9-6dr9d" Oct 13 14:43:31 crc kubenswrapper[4797]: I1013 14:43:31.899236 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b76c56699-bpj22"] Oct 13 14:43:31 crc kubenswrapper[4797]: I1013 14:43:31.899606 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerName="dnsmasq-dns" containerID="cri-o://a672498f2c5b67a50102b8d27065feb7431c336470d6b617401bc84102e0c62e" gracePeriod=10 Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.426859 4797 generic.go:334] "Generic (PLEG): container finished" podID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerID="a672498f2c5b67a50102b8d27065feb7431c336470d6b617401bc84102e0c62e" exitCode=0 Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.427153 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" event={"ID":"90aa9215-2ccf-4400-b400-26d0d8e1bba8","Type":"ContainerDied","Data":"a672498f2c5b67a50102b8d27065feb7431c336470d6b617401bc84102e0c62e"} Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.427184 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" event={"ID":"90aa9215-2ccf-4400-b400-26d0d8e1bba8","Type":"ContainerDied","Data":"97229223ed27c7f8fe400f2db56bd7b58e98c06cbb3c2674240f3a722a7e60a5"} Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.427198 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97229223ed27c7f8fe400f2db56bd7b58e98c06cbb3c2674240f3a722a7e60a5" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.527580 4797 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.629229 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-sb\") pod \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.629340 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-nb\") pod \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.629396 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2rw8\" (UniqueName: \"kubernetes.io/projected/90aa9215-2ccf-4400-b400-26d0d8e1bba8-kube-api-access-z2rw8\") pod \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.629498 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-openstack-cell1\") pod \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.629562 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-config\") pod \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.629670 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-dns-svc\") pod \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\" (UID: \"90aa9215-2ccf-4400-b400-26d0d8e1bba8\") " Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.655510 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90aa9215-2ccf-4400-b400-26d0d8e1bba8-kube-api-access-z2rw8" (OuterVolumeSpecName: "kube-api-access-z2rw8") pod "90aa9215-2ccf-4400-b400-26d0d8e1bba8" (UID: "90aa9215-2ccf-4400-b400-26d0d8e1bba8"). InnerVolumeSpecName "kube-api-access-z2rw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.695544 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90aa9215-2ccf-4400-b400-26d0d8e1bba8" (UID: "90aa9215-2ccf-4400-b400-26d0d8e1bba8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.706934 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-config" (OuterVolumeSpecName: "config") pod "90aa9215-2ccf-4400-b400-26d0d8e1bba8" (UID: "90aa9215-2ccf-4400-b400-26d0d8e1bba8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.714384 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90aa9215-2ccf-4400-b400-26d0d8e1bba8" (UID: "90aa9215-2ccf-4400-b400-26d0d8e1bba8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.717642 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "90aa9215-2ccf-4400-b400-26d0d8e1bba8" (UID: "90aa9215-2ccf-4400-b400-26d0d8e1bba8"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.732568 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.732601 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-config\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.732610 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.732619 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.732629 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2rw8\" (UniqueName: \"kubernetes.io/projected/90aa9215-2ccf-4400-b400-26d0d8e1bba8-kube-api-access-z2rw8\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.757389 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90aa9215-2ccf-4400-b400-26d0d8e1bba8" (UID: "90aa9215-2ccf-4400-b400-26d0d8e1bba8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:43:32 crc kubenswrapper[4797]: I1013 14:43:32.834714 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90aa9215-2ccf-4400-b400-26d0d8e1bba8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 14:43:33 crc kubenswrapper[4797]: I1013 14:43:33.437425 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b76c56699-bpj22" Oct 13 14:43:33 crc kubenswrapper[4797]: I1013 14:43:33.465913 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b76c56699-bpj22"] Oct 13 14:43:33 crc kubenswrapper[4797]: I1013 14:43:33.473597 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b76c56699-bpj22"] Oct 13 14:43:35 crc kubenswrapper[4797]: I1013 14:43:35.240069 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:43:35 crc kubenswrapper[4797]: E1013 14:43:35.240917 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:43:35 crc kubenswrapper[4797]: I1013 14:43:35.263212 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" 
path="/var/lib/kubelet/pods/90aa9215-2ccf-4400-b400-26d0d8e1bba8/volumes" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.513785 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr"] Oct 13 14:43:42 crc kubenswrapper[4797]: E1013 14:43:42.514700 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerName="dnsmasq-dns" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.514713 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerName="dnsmasq-dns" Oct 13 14:43:42 crc kubenswrapper[4797]: E1013 14:43:42.514723 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerName="dnsmasq-dns" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.514729 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerName="dnsmasq-dns" Oct 13 14:43:42 crc kubenswrapper[4797]: E1013 14:43:42.514763 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerName="init" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.514770 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerName="init" Oct 13 14:43:42 crc kubenswrapper[4797]: E1013 14:43:42.514777 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerName="init" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.514784 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerName="init" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.515000 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a123a6-e2f6-4491-abbb-0eeb468e2879" containerName="dnsmasq-dns" 
Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.515011 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="90aa9215-2ccf-4400-b400-26d0d8e1bba8" containerName="dnsmasq-dns" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.515731 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.519607 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.520300 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.520536 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.521131 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.544271 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr"] Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.624344 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.624571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hdjvb\" (UniqueName: \"kubernetes.io/projected/9ceb0bb2-f070-48f0-a768-f0a75b81c937-kube-api-access-hdjvb\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.624640 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.624664 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.624683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.726439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-inventory\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.726487 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.726508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.727459 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.727562 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjvb\" (UniqueName: \"kubernetes.io/projected/9ceb0bb2-f070-48f0-a768-f0a75b81c937-kube-api-access-hdjvb\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.732814 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.733000 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.734089 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.740713 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.745759 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hdjvb\" (UniqueName: \"kubernetes.io/projected/9ceb0bb2-f070-48f0-a768-f0a75b81c937-kube-api-access-hdjvb\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:42 crc kubenswrapper[4797]: I1013 14:43:42.844310 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:43:43 crc kubenswrapper[4797]: I1013 14:43:43.503557 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr"] Oct 13 14:43:43 crc kubenswrapper[4797]: I1013 14:43:43.537476 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" event={"ID":"9ceb0bb2-f070-48f0-a768-f0a75b81c937","Type":"ContainerStarted","Data":"e6679ff63a979d567cc693926699bb42567ba1cf3ca1252fbab505a58accdf43"} Oct 13 14:43:49 crc kubenswrapper[4797]: I1013 14:43:49.237977 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:43:49 crc kubenswrapper[4797]: E1013 14:43:49.238770 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:43:53 crc kubenswrapper[4797]: I1013 14:43:53.636558 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" 
event={"ID":"9ceb0bb2-f070-48f0-a768-f0a75b81c937","Type":"ContainerStarted","Data":"7d0e9c69b9b2898001fd9d82c3ef43e480d7f63ea74a39839796145dbb759b0f"} Oct 13 14:43:56 crc kubenswrapper[4797]: I1013 14:43:56.040919 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" podStartSLOduration=4.649478967 podStartE2EDuration="14.040903017s" podCreationTimestamp="2025-10-13 14:43:42 +0000 UTC" firstStartedPulling="2025-10-13 14:43:43.509191237 +0000 UTC m=+5801.042741493" lastFinishedPulling="2025-10-13 14:43:52.900615287 +0000 UTC m=+5810.434165543" observedRunningTime="2025-10-13 14:43:53.664167233 +0000 UTC m=+5811.197717489" watchObservedRunningTime="2025-10-13 14:43:56.040903017 +0000 UTC m=+5813.574453273" Oct 13 14:43:56 crc kubenswrapper[4797]: I1013 14:43:56.044979 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zpd46"] Oct 13 14:43:56 crc kubenswrapper[4797]: I1013 14:43:56.054008 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zpd46"] Oct 13 14:43:57 crc kubenswrapper[4797]: I1013 14:43:57.248056 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56f7d3a-7c03-403f-ac55-88898b8fc587" path="/var/lib/kubelet/pods/b56f7d3a-7c03-403f-ac55-88898b8fc587/volumes" Oct 13 14:44:00 crc kubenswrapper[4797]: I1013 14:44:00.237748 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:44:00 crc kubenswrapper[4797]: E1013 14:44:00.238669 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:44:06 crc kubenswrapper[4797]: I1013 14:44:06.053791 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4e02-account-create-r4vkp"] Oct 13 14:44:06 crc kubenswrapper[4797]: I1013 14:44:06.063140 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4e02-account-create-r4vkp"] Oct 13 14:44:06 crc kubenswrapper[4797]: I1013 14:44:06.777200 4797 generic.go:334] "Generic (PLEG): container finished" podID="9ceb0bb2-f070-48f0-a768-f0a75b81c937" containerID="7d0e9c69b9b2898001fd9d82c3ef43e480d7f63ea74a39839796145dbb759b0f" exitCode=0 Oct 13 14:44:06 crc kubenswrapper[4797]: I1013 14:44:06.777358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" event={"ID":"9ceb0bb2-f070-48f0-a768-f0a75b81c937","Type":"ContainerDied","Data":"7d0e9c69b9b2898001fd9d82c3ef43e480d7f63ea74a39839796145dbb759b0f"} Oct 13 14:44:07 crc kubenswrapper[4797]: I1013 14:44:07.247436 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478609df-1ea6-49b7-b109-81174cbbb205" path="/var/lib/kubelet/pods/478609df-1ea6-49b7-b109-81174cbbb205/volumes" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.298938 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.491923 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-pre-adoption-validation-combined-ca-bundle\") pod \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.492110 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ssh-key\") pod \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.492267 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdjvb\" (UniqueName: \"kubernetes.io/projected/9ceb0bb2-f070-48f0-a768-f0a75b81c937-kube-api-access-hdjvb\") pod \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.492982 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-inventory\") pod \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.493008 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ceph\") pod \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\" (UID: \"9ceb0bb2-f070-48f0-a768-f0a75b81c937\") " Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.497507 4797 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ceph" (OuterVolumeSpecName: "ceph") pod "9ceb0bb2-f070-48f0-a768-f0a75b81c937" (UID: "9ceb0bb2-f070-48f0-a768-f0a75b81c937"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.497825 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "9ceb0bb2-f070-48f0-a768-f0a75b81c937" (UID: "9ceb0bb2-f070-48f0-a768-f0a75b81c937"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.500303 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ceb0bb2-f070-48f0-a768-f0a75b81c937-kube-api-access-hdjvb" (OuterVolumeSpecName: "kube-api-access-hdjvb") pod "9ceb0bb2-f070-48f0-a768-f0a75b81c937" (UID: "9ceb0bb2-f070-48f0-a768-f0a75b81c937"). InnerVolumeSpecName "kube-api-access-hdjvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.522042 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-inventory" (OuterVolumeSpecName: "inventory") pod "9ceb0bb2-f070-48f0-a768-f0a75b81c937" (UID: "9ceb0bb2-f070-48f0-a768-f0a75b81c937"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.522245 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ceb0bb2-f070-48f0-a768-f0a75b81c937" (UID: "9ceb0bb2-f070-48f0-a768-f0a75b81c937"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.595883 4797 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.595936 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.595951 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdjvb\" (UniqueName: \"kubernetes.io/projected/9ceb0bb2-f070-48f0-a768-f0a75b81c937-kube-api-access-hdjvb\") on node \"crc\" DevicePath \"\"" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.595965 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.595976 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ceb0bb2-f070-48f0-a768-f0a75b81c937-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.806749 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" event={"ID":"9ceb0bb2-f070-48f0-a768-f0a75b81c937","Type":"ContainerDied","Data":"e6679ff63a979d567cc693926699bb42567ba1cf3ca1252fbab505a58accdf43"} Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.806828 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6679ff63a979d567cc693926699bb42567ba1cf3ca1252fbab505a58accdf43" Oct 13 14:44:08 crc kubenswrapper[4797]: I1013 14:44:08.806994 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr" Oct 13 14:44:13 crc kubenswrapper[4797]: I1013 14:44:13.249575 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:44:13 crc kubenswrapper[4797]: E1013 14:44:13.250466 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.248867 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb"] Oct 13 14:44:15 crc kubenswrapper[4797]: E1013 14:44:15.249478 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ceb0bb2-f070-48f0-a768-f0a75b81c937" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.249495 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ceb0bb2-f070-48f0-a768-f0a75b81c937" 
containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.249764 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ceb0bb2-f070-48f0-a768-f0a75b81c937" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.253709 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.261154 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.262349 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.263467 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.264493 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb"] Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.270471 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.431717 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.431770 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xlwx\" (UniqueName: \"kubernetes.io/projected/ab353644-5bc4-4dea-a881-0c4009efb270-kube-api-access-6xlwx\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.431880 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.432003 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.432089 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.533435 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ssh-key\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.533511 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.533665 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.533693 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xlwx\" (UniqueName: \"kubernetes.io/projected/ab353644-5bc4-4dea-a881-0c4009efb270-kube-api-access-6xlwx\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.533738 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 
14:44:15.540048 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.540531 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.540624 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.540717 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.553759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xlwx\" (UniqueName: \"kubernetes.io/projected/ab353644-5bc4-4dea-a881-0c4009efb270-kube-api-access-6xlwx\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb\" (UID: 
\"ab353644-5bc4-4dea-a881-0c4009efb270\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:15 crc kubenswrapper[4797]: I1013 14:44:15.580516 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" Oct 13 14:44:16 crc kubenswrapper[4797]: I1013 14:44:16.159593 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb"] Oct 13 14:44:16 crc kubenswrapper[4797]: I1013 14:44:16.893315 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" event={"ID":"ab353644-5bc4-4dea-a881-0c4009efb270","Type":"ContainerStarted","Data":"de05fba5e02a9244a7a3583b3037f9b6940b37571bb1dae2e668281c5541775c"} Oct 13 14:44:16 crc kubenswrapper[4797]: I1013 14:44:16.893850 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" event={"ID":"ab353644-5bc4-4dea-a881-0c4009efb270","Type":"ContainerStarted","Data":"f7c386fa5ec36d2706ecd8472075c7015a5b221dac631748966edfffbbee54a1"} Oct 13 14:44:16 crc kubenswrapper[4797]: I1013 14:44:16.922421 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" podStartSLOduration=1.501743642 podStartE2EDuration="1.922391549s" podCreationTimestamp="2025-10-13 14:44:15 +0000 UTC" firstStartedPulling="2025-10-13 14:44:16.170014945 +0000 UTC m=+5833.703565201" lastFinishedPulling="2025-10-13 14:44:16.590662852 +0000 UTC m=+5834.124213108" observedRunningTime="2025-10-13 14:44:16.917096199 +0000 UTC m=+5834.450646455" watchObservedRunningTime="2025-10-13 14:44:16.922391549 +0000 UTC m=+5834.455941805" Oct 13 14:44:17 crc kubenswrapper[4797]: I1013 14:44:17.910639 4797 scope.go:117] "RemoveContainer" 
containerID="321df0c3ef0e5d2e9db3af1221c495f77bb5cd483167b07fb3a08ad6aa2b2e18" Oct 13 14:44:17 crc kubenswrapper[4797]: I1013 14:44:17.938307 4797 scope.go:117] "RemoveContainer" containerID="3bb66a10e270b3d66dd15ce7e287b20b9c0765c29b2947788cf93f37745a70a7" Oct 13 14:44:25 crc kubenswrapper[4797]: I1013 14:44:25.236918 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:44:25 crc kubenswrapper[4797]: E1013 14:44:25.237571 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:44:33 crc kubenswrapper[4797]: I1013 14:44:33.043515 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vrf8l"] Oct 13 14:44:33 crc kubenswrapper[4797]: I1013 14:44:33.054347 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vrf8l"] Oct 13 14:44:33 crc kubenswrapper[4797]: I1013 14:44:33.250224 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3013fc1e-e99d-4fb5-96c5-2a384769b668" path="/var/lib/kubelet/pods/3013fc1e-e99d-4fb5-96c5-2a384769b668/volumes" Oct 13 14:44:36 crc kubenswrapper[4797]: I1013 14:44:36.237034 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:44:36 crc kubenswrapper[4797]: E1013 14:44:36.237732 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:44:50 crc kubenswrapper[4797]: I1013 14:44:50.236178 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:44:50 crc kubenswrapper[4797]: E1013 14:44:50.237039 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.147376 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh"] Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.149665 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.151707 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.151710 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.158316 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh"] Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.210792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-config-volume\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.210924 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65btc\" (UniqueName: \"kubernetes.io/projected/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-kube-api-access-65btc\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.210979 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-secret-volume\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.312386 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-config-volume\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.312482 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65btc\" (UniqueName: \"kubernetes.io/projected/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-kube-api-access-65btc\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.312534 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-secret-volume\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.313929 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-config-volume\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.318215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-secret-volume\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.328699 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65btc\" (UniqueName: \"kubernetes.io/projected/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-kube-api-access-65btc\") pod \"collect-profiles-29339445-2wvzh\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.478745 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:00 crc kubenswrapper[4797]: I1013 14:45:00.950732 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh"] Oct 13 14:45:01 crc kubenswrapper[4797]: I1013 14:45:01.294021 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" event={"ID":"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca","Type":"ContainerStarted","Data":"3ace9afe25f80a81e9ab314e3b95faca7ccbfc352e6c7a8cd8ebed938bc7004c"} Oct 13 14:45:01 crc kubenswrapper[4797]: I1013 14:45:01.294357 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" event={"ID":"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca","Type":"ContainerStarted","Data":"6bf917b4ae12aced324d53b4fc53079e73e88b019426626eeb643e8df452f11f"} Oct 13 14:45:01 crc kubenswrapper[4797]: I1013 14:45:01.316228 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" 
podStartSLOduration=1.316193844 podStartE2EDuration="1.316193844s" podCreationTimestamp="2025-10-13 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 14:45:01.309575631 +0000 UTC m=+5878.843125907" watchObservedRunningTime="2025-10-13 14:45:01.316193844 +0000 UTC m=+5878.849744090" Oct 13 14:45:02 crc kubenswrapper[4797]: I1013 14:45:02.304296 4797 generic.go:334] "Generic (PLEG): container finished" podID="d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" containerID="3ace9afe25f80a81e9ab314e3b95faca7ccbfc352e6c7a8cd8ebed938bc7004c" exitCode=0 Oct 13 14:45:02 crc kubenswrapper[4797]: I1013 14:45:02.304342 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" event={"ID":"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca","Type":"ContainerDied","Data":"3ace9afe25f80a81e9ab314e3b95faca7ccbfc352e6c7a8cd8ebed938bc7004c"} Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.253473 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:45:03 crc kubenswrapper[4797]: E1013 14:45:03.253861 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.706793 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.777719 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-secret-volume\") pod \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.778011 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65btc\" (UniqueName: \"kubernetes.io/projected/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-kube-api-access-65btc\") pod \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.778159 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-config-volume\") pod \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\" (UID: \"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca\") " Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.778841 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" (UID: "d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.785190 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" (UID: "d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.785346 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-kube-api-access-65btc" (OuterVolumeSpecName: "kube-api-access-65btc") pod "d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" (UID: "d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca"). InnerVolumeSpecName "kube-api-access-65btc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.889441 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65btc\" (UniqueName: \"kubernetes.io/projected/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-kube-api-access-65btc\") on node \"crc\" DevicePath \"\"" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.889740 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:45:03 crc kubenswrapper[4797]: I1013 14:45:03.889752 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 14:45:04 crc kubenswrapper[4797]: I1013 14:45:04.323246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" event={"ID":"d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca","Type":"ContainerDied","Data":"6bf917b4ae12aced324d53b4fc53079e73e88b019426626eeb643e8df452f11f"} Oct 13 14:45:04 crc kubenswrapper[4797]: I1013 14:45:04.323298 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bf917b4ae12aced324d53b4fc53079e73e88b019426626eeb643e8df452f11f" Oct 13 14:45:04 crc kubenswrapper[4797]: I1013 14:45:04.323352 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh" Oct 13 14:45:04 crc kubenswrapper[4797]: I1013 14:45:04.388797 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd"] Oct 13 14:45:04 crc kubenswrapper[4797]: I1013 14:45:04.397892 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339400-r7fxd"] Oct 13 14:45:04 crc kubenswrapper[4797]: E1013 14:45:04.456506 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d23b2c_c8c8_4d5e_8577_0574a1dbf3ca.slice\": RecentStats: unable to find data in memory cache]" Oct 13 14:45:05 crc kubenswrapper[4797]: I1013 14:45:05.248036 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6677772-d44a-4d3a-b496-5a2e5970db18" path="/var/lib/kubelet/pods/b6677772-d44a-4d3a-b496-5a2e5970db18/volumes" Oct 13 14:45:18 crc kubenswrapper[4797]: I1013 14:45:18.031071 4797 scope.go:117] "RemoveContainer" containerID="968437b0f5ba9d1645208f39f8303e77d536fdf43eb97197f588c9f7832bfdc4" Oct 13 14:45:18 crc kubenswrapper[4797]: I1013 14:45:18.060411 4797 scope.go:117] "RemoveContainer" containerID="8d6f97744260ac699cc7e6337e820eb81c7f1e1243950affaf6429523db3681f" Oct 13 14:45:18 crc kubenswrapper[4797]: I1013 14:45:18.236347 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:45:18 crc kubenswrapper[4797]: E1013 14:45:18.237230 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:45:31 crc kubenswrapper[4797]: I1013 14:45:31.236651 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:45:31 crc kubenswrapper[4797]: E1013 14:45:31.237508 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:45:43 crc kubenswrapper[4797]: I1013 14:45:43.243392 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:45:43 crc kubenswrapper[4797]: E1013 14:45:43.244211 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:45:57 crc kubenswrapper[4797]: I1013 14:45:57.236884 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:45:57 crc kubenswrapper[4797]: E1013 14:45:57.237678 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:46:08 crc kubenswrapper[4797]: I1013 14:46:08.237361 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:46:08 crc kubenswrapper[4797]: E1013 14:46:08.238288 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:46:22 crc kubenswrapper[4797]: I1013 14:46:22.236209 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:46:22 crc kubenswrapper[4797]: E1013 14:46:22.236985 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:46:34 crc kubenswrapper[4797]: I1013 14:46:34.241317 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:46:34 crc kubenswrapper[4797]: E1013 14:46:34.242297 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:46:47 crc kubenswrapper[4797]: I1013 14:46:47.236603 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:46:47 crc kubenswrapper[4797]: E1013 14:46:47.237326 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:47:00 crc kubenswrapper[4797]: I1013 14:47:00.236258 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:47:00 crc kubenswrapper[4797]: I1013 14:47:00.571303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"267d09256aad9f3fe4d63c88f12e79b6b8c943961fc8e4ffcccbde0e623208bd"} Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.313221 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4sdfw"] Oct 13 14:47:29 crc kubenswrapper[4797]: E1013 14:47:29.314208 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" containerName="collect-profiles" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.314269 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" 
containerName="collect-profiles" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.314847 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" containerName="collect-profiles" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.317589 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.331727 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4sdfw"] Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.477367 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-utilities\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.477706 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-catalog-content\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.477730 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7576\" (UniqueName: \"kubernetes.io/projected/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-kube-api-access-r7576\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.579485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-utilities\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.579549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-catalog-content\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.579577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7576\" (UniqueName: \"kubernetes.io/projected/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-kube-api-access-r7576\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.580157 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-utilities\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.580264 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-catalog-content\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.600867 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7576\" (UniqueName: 
\"kubernetes.io/projected/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-kube-api-access-r7576\") pod \"redhat-operators-4sdfw\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:29 crc kubenswrapper[4797]: I1013 14:47:29.656119 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:30 crc kubenswrapper[4797]: I1013 14:47:30.166224 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4sdfw"] Oct 13 14:47:30 crc kubenswrapper[4797]: I1013 14:47:30.869544 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerID="a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858" exitCode=0 Oct 13 14:47:30 crc kubenswrapper[4797]: I1013 14:47:30.869599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerDied","Data":"a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858"} Oct 13 14:47:30 crc kubenswrapper[4797]: I1013 14:47:30.869861 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerStarted","Data":"65a41bcd06abd97488716f54fa99515e007bdec498724f555f03c00fcb180172"} Oct 13 14:47:30 crc kubenswrapper[4797]: I1013 14:47:30.873602 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 14:47:32 crc kubenswrapper[4797]: I1013 14:47:32.888702 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerStarted","Data":"c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff"} Oct 13 14:47:36 crc 
kubenswrapper[4797]: I1013 14:47:36.934238 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerID="c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff" exitCode=0 Oct 13 14:47:36 crc kubenswrapper[4797]: I1013 14:47:36.934313 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerDied","Data":"c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff"} Oct 13 14:47:37 crc kubenswrapper[4797]: I1013 14:47:37.949683 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerStarted","Data":"02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399"} Oct 13 14:47:37 crc kubenswrapper[4797]: I1013 14:47:37.976092 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4sdfw" podStartSLOduration=2.395233264 podStartE2EDuration="8.976071599s" podCreationTimestamp="2025-10-13 14:47:29 +0000 UTC" firstStartedPulling="2025-10-13 14:47:30.872202355 +0000 UTC m=+6028.405752611" lastFinishedPulling="2025-10-13 14:47:37.45304069 +0000 UTC m=+6034.986590946" observedRunningTime="2025-10-13 14:47:37.968431041 +0000 UTC m=+6035.501981327" watchObservedRunningTime="2025-10-13 14:47:37.976071599 +0000 UTC m=+6035.509621855" Oct 13 14:47:39 crc kubenswrapper[4797]: I1013 14:47:39.657109 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:39 crc kubenswrapper[4797]: I1013 14:47:39.657491 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:40 crc kubenswrapper[4797]: I1013 14:47:40.707853 4797 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-4sdfw" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="registry-server" probeResult="failure" output=< Oct 13 14:47:40 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 14:47:40 crc kubenswrapper[4797]: > Oct 13 14:47:49 crc kubenswrapper[4797]: I1013 14:47:49.709009 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:49 crc kubenswrapper[4797]: I1013 14:47:49.764684 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:49 crc kubenswrapper[4797]: I1013 14:47:49.949004 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4sdfw"] Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.069256 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4sdfw" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="registry-server" containerID="cri-o://02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399" gracePeriod=2 Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.643476 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.766052 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7576\" (UniqueName: \"kubernetes.io/projected/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-kube-api-access-r7576\") pod \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.766267 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-catalog-content\") pod \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.766343 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-utilities\") pod \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\" (UID: \"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1\") " Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.767119 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-utilities" (OuterVolumeSpecName: "utilities") pod "a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" (UID: "a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.773851 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-kube-api-access-r7576" (OuterVolumeSpecName: "kube-api-access-r7576") pod "a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" (UID: "a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1"). InnerVolumeSpecName "kube-api-access-r7576". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.850912 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" (UID: "a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.868379 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7576\" (UniqueName: \"kubernetes.io/projected/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-kube-api-access-r7576\") on node \"crc\" DevicePath \"\"" Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.868414 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:47:51 crc kubenswrapper[4797]: I1013 14:47:51.868430 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.080525 4797 generic.go:334] "Generic (PLEG): container finished" podID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerID="02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399" exitCode=0 Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.080578 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerDied","Data":"02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399"} Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.080615 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-4sdfw" event={"ID":"a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1","Type":"ContainerDied","Data":"65a41bcd06abd97488716f54fa99515e007bdec498724f555f03c00fcb180172"} Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.080634 4797 scope.go:117] "RemoveContainer" containerID="02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.080641 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4sdfw" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.113368 4797 scope.go:117] "RemoveContainer" containerID="c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.118198 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4sdfw"] Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.126209 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4sdfw"] Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.144903 4797 scope.go:117] "RemoveContainer" containerID="a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.203287 4797 scope.go:117] "RemoveContainer" containerID="02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399" Oct 13 14:47:52 crc kubenswrapper[4797]: E1013 14:47:52.203980 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399\": container with ID starting with 02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399 not found: ID does not exist" containerID="02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.204028 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399"} err="failed to get container status \"02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399\": rpc error: code = NotFound desc = could not find container \"02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399\": container with ID starting with 02ab0ff8d5bf2a965d81e983155f6c9f11b23c19820f864e3ab440c7c32d4399 not found: ID does not exist" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.204059 4797 scope.go:117] "RemoveContainer" containerID="c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff" Oct 13 14:47:52 crc kubenswrapper[4797]: E1013 14:47:52.206325 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff\": container with ID starting with c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff not found: ID does not exist" containerID="c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.206375 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff"} err="failed to get container status \"c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff\": rpc error: code = NotFound desc = could not find container \"c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff\": container with ID starting with c426a27e2e717ebb8f33e34cdb9a3d390f98cd5d7d7e4097a2cc79388f11a7ff not found: ID does not exist" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.206415 4797 scope.go:117] "RemoveContainer" containerID="a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858" Oct 13 14:47:52 crc kubenswrapper[4797]: E1013 
14:47:52.207578 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858\": container with ID starting with a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858 not found: ID does not exist" containerID="a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858" Oct 13 14:47:52 crc kubenswrapper[4797]: I1013 14:47:52.207649 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858"} err="failed to get container status \"a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858\": rpc error: code = NotFound desc = could not find container \"a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858\": container with ID starting with a13274df2d75b6041a20aea32c046d59820c6fa989cefd0019635ab73d1fc858 not found: ID does not exist" Oct 13 14:47:53 crc kubenswrapper[4797]: I1013 14:47:53.251900 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" path="/var/lib/kubelet/pods/a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1/volumes" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.166795 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-klpcc"] Oct 13 14:48:13 crc kubenswrapper[4797]: E1013 14:48:13.167771 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="registry-server" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.167787 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="registry-server" Oct 13 14:48:13 crc kubenswrapper[4797]: E1013 14:48:13.167839 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" 
containerName="extract-utilities" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.167850 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="extract-utilities" Oct 13 14:48:13 crc kubenswrapper[4797]: E1013 14:48:13.167867 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="extract-content" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.167879 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="extract-content" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.168187 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d57bde-a378-4a0a-9c32-e8d19cb1d5a1" containerName="registry-server" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.170218 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.184446 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klpcc"] Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.241345 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-utilities\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.241746 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szbfd\" (UniqueName: \"kubernetes.io/projected/2ba32138-2528-4387-9c7d-cc122d18caac-kube-api-access-szbfd\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " 
pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.242116 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-catalog-content\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.344066 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-catalog-content\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.344217 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-utilities\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.344248 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szbfd\" (UniqueName: \"kubernetes.io/projected/2ba32138-2528-4387-9c7d-cc122d18caac-kube-api-access-szbfd\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.345025 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-utilities\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " 
pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.345152 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-catalog-content\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.372224 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szbfd\" (UniqueName: \"kubernetes.io/projected/2ba32138-2528-4387-9c7d-cc122d18caac-kube-api-access-szbfd\") pod \"redhat-marketplace-klpcc\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:13 crc kubenswrapper[4797]: I1013 14:48:13.503587 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:14 crc kubenswrapper[4797]: I1013 14:48:14.007150 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klpcc"] Oct 13 14:48:14 crc kubenswrapper[4797]: I1013 14:48:14.329758 4797 generic.go:334] "Generic (PLEG): container finished" podID="2ba32138-2528-4387-9c7d-cc122d18caac" containerID="c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676" exitCode=0 Oct 13 14:48:14 crc kubenswrapper[4797]: I1013 14:48:14.329912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerDied","Data":"c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676"} Oct 13 14:48:14 crc kubenswrapper[4797]: I1013 14:48:14.330169 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" 
event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerStarted","Data":"9b96d04e6bc8b6cb5573b691482715772dcdb2bf6ed3affc64c7905ecadd51f2"} Oct 13 14:48:16 crc kubenswrapper[4797]: I1013 14:48:16.352525 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerStarted","Data":"8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15"} Oct 13 14:48:17 crc kubenswrapper[4797]: I1013 14:48:17.362846 4797 generic.go:334] "Generic (PLEG): container finished" podID="2ba32138-2528-4387-9c7d-cc122d18caac" containerID="8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15" exitCode=0 Oct 13 14:48:17 crc kubenswrapper[4797]: I1013 14:48:17.362918 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerDied","Data":"8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15"} Oct 13 14:48:18 crc kubenswrapper[4797]: I1013 14:48:18.382377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerStarted","Data":"ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257"} Oct 13 14:48:18 crc kubenswrapper[4797]: I1013 14:48:18.402411 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-klpcc" podStartSLOduration=1.9573359190000001 podStartE2EDuration="5.402390074s" podCreationTimestamp="2025-10-13 14:48:13 +0000 UTC" firstStartedPulling="2025-10-13 14:48:14.332065513 +0000 UTC m=+6071.865615769" lastFinishedPulling="2025-10-13 14:48:17.777119668 +0000 UTC m=+6075.310669924" observedRunningTime="2025-10-13 14:48:18.396768626 +0000 UTC m=+6075.930318922" watchObservedRunningTime="2025-10-13 14:48:18.402390074 +0000 UTC 
m=+6075.935940360" Oct 13 14:48:23 crc kubenswrapper[4797]: I1013 14:48:23.504139 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:23 crc kubenswrapper[4797]: I1013 14:48:23.504868 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:23 crc kubenswrapper[4797]: I1013 14:48:23.558132 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:24 crc kubenswrapper[4797]: I1013 14:48:24.052589 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-q9x6q"] Oct 13 14:48:24 crc kubenswrapper[4797]: I1013 14:48:24.067785 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-q9x6q"] Oct 13 14:48:24 crc kubenswrapper[4797]: I1013 14:48:24.486312 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:24 crc kubenswrapper[4797]: I1013 14:48:24.534991 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klpcc"] Oct 13 14:48:25 crc kubenswrapper[4797]: I1013 14:48:25.255016 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe03d3d1-d5e5-4ab1-bbce-148648453626" path="/var/lib/kubelet/pods/fe03d3d1-d5e5-4ab1-bbce-148648453626/volumes" Oct 13 14:48:26 crc kubenswrapper[4797]: I1013 14:48:26.457530 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-klpcc" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="registry-server" containerID="cri-o://ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257" gracePeriod=2 Oct 13 14:48:26 crc kubenswrapper[4797]: I1013 14:48:26.957076 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.155014 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-catalog-content\") pod \"2ba32138-2528-4387-9c7d-cc122d18caac\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.155368 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-utilities\") pod \"2ba32138-2528-4387-9c7d-cc122d18caac\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.155477 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szbfd\" (UniqueName: \"kubernetes.io/projected/2ba32138-2528-4387-9c7d-cc122d18caac-kube-api-access-szbfd\") pod \"2ba32138-2528-4387-9c7d-cc122d18caac\" (UID: \"2ba32138-2528-4387-9c7d-cc122d18caac\") " Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.156396 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-utilities" (OuterVolumeSpecName: "utilities") pod "2ba32138-2528-4387-9c7d-cc122d18caac" (UID: "2ba32138-2528-4387-9c7d-cc122d18caac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.157284 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.161473 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba32138-2528-4387-9c7d-cc122d18caac-kube-api-access-szbfd" (OuterVolumeSpecName: "kube-api-access-szbfd") pod "2ba32138-2528-4387-9c7d-cc122d18caac" (UID: "2ba32138-2528-4387-9c7d-cc122d18caac"). InnerVolumeSpecName "kube-api-access-szbfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.170195 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ba32138-2528-4387-9c7d-cc122d18caac" (UID: "2ba32138-2528-4387-9c7d-cc122d18caac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.258982 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba32138-2528-4387-9c7d-cc122d18caac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.259020 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szbfd\" (UniqueName: \"kubernetes.io/projected/2ba32138-2528-4387-9c7d-cc122d18caac-kube-api-access-szbfd\") on node \"crc\" DevicePath \"\"" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.470480 4797 generic.go:334] "Generic (PLEG): container finished" podID="2ba32138-2528-4387-9c7d-cc122d18caac" containerID="ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257" exitCode=0 Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.470534 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerDied","Data":"ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257"} Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.470564 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klpcc" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.470589 4797 scope.go:117] "RemoveContainer" containerID="ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.470573 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klpcc" event={"ID":"2ba32138-2528-4387-9c7d-cc122d18caac","Type":"ContainerDied","Data":"9b96d04e6bc8b6cb5573b691482715772dcdb2bf6ed3affc64c7905ecadd51f2"} Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.506253 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klpcc"] Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.507215 4797 scope.go:117] "RemoveContainer" containerID="8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.516590 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-klpcc"] Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.527214 4797 scope.go:117] "RemoveContainer" containerID="c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.579642 4797 scope.go:117] "RemoveContainer" containerID="ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257" Oct 13 14:48:27 crc kubenswrapper[4797]: E1013 14:48:27.580983 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257\": container with ID starting with ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257 not found: ID does not exist" containerID="ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.581044 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257"} err="failed to get container status \"ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257\": rpc error: code = NotFound desc = could not find container \"ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257\": container with ID starting with ac135d9ab4d13886371839aab3c330032f1e34aaa9d293513721b1d0e1597257 not found: ID does not exist" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.581078 4797 scope.go:117] "RemoveContainer" containerID="8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15" Oct 13 14:48:27 crc kubenswrapper[4797]: E1013 14:48:27.581907 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15\": container with ID starting with 8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15 not found: ID does not exist" containerID="8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.581955 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15"} err="failed to get container status \"8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15\": rpc error: code = NotFound desc = could not find container \"8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15\": container with ID starting with 8f912090feddf60f282f9c980555f2e0858b5b341b34546522b1f78237758a15 not found: ID does not exist" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.581983 4797 scope.go:117] "RemoveContainer" containerID="c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676" Oct 13 14:48:27 crc kubenswrapper[4797]: E1013 
14:48:27.582405 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676\": container with ID starting with c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676 not found: ID does not exist" containerID="c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676" Oct 13 14:48:27 crc kubenswrapper[4797]: I1013 14:48:27.583100 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676"} err="failed to get container status \"c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676\": rpc error: code = NotFound desc = could not find container \"c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676\": container with ID starting with c65f4013f9760720d375548d326844382cc6777bdd59acea878095f14a3d3676 not found: ID does not exist" Oct 13 14:48:29 crc kubenswrapper[4797]: I1013 14:48:29.248060 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" path="/var/lib/kubelet/pods/2ba32138-2528-4387-9c7d-cc122d18caac/volumes" Oct 13 14:48:35 crc kubenswrapper[4797]: I1013 14:48:35.029453 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-2f95-account-create-xvpl7"] Oct 13 14:48:35 crc kubenswrapper[4797]: I1013 14:48:35.041697 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-2f95-account-create-xvpl7"] Oct 13 14:48:35 crc kubenswrapper[4797]: I1013 14:48:35.249468 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a153a9b5-e82d-467b-bbd0-6904bbd9a75b" path="/var/lib/kubelet/pods/a153a9b5-e82d-467b-bbd0-6904bbd9a75b/volumes" Oct 13 14:48:50 crc kubenswrapper[4797]: I1013 14:48:50.034196 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-dlf5b"] 
Oct 13 14:48:50 crc kubenswrapper[4797]: I1013 14:48:50.042430 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-dlf5b"] Oct 13 14:48:51 crc kubenswrapper[4797]: I1013 14:48:51.256902 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae765d31-580c-49c1-be2f-aca757f6e464" path="/var/lib/kubelet/pods/ae765d31-580c-49c1-be2f-aca757f6e464/volumes" Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.120355 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.120939 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.232653 4797 scope.go:117] "RemoveContainer" containerID="abcfb420a7f3dc8f893e44ec3a8a7b2c5085cd3df2159637112dd9f4c8eb2c9f" Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.269545 4797 scope.go:117] "RemoveContainer" containerID="a599d6d74584c4a41379ec371da024c5f86e25efa6e5f21aa56ee11b6bc3ac95" Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.318019 4797 scope.go:117] "RemoveContainer" containerID="9f64fc111deb6154d387cfa1edb9779b1ba05a14e8e687b7f6af0c606a85b201" Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.372507 4797 scope.go:117] "RemoveContainer" containerID="a672498f2c5b67a50102b8d27065feb7431c336470d6b617401bc84102e0c62e" Oct 13 14:49:18 crc kubenswrapper[4797]: I1013 14:49:18.403340 4797 scope.go:117] "RemoveContainer" 
containerID="990429f1b2e468d798484f93bf4cc2185b849d3118b7e9458e9e8e31da5e1e77" Oct 13 14:49:48 crc kubenswrapper[4797]: I1013 14:49:48.120336 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:49:48 crc kubenswrapper[4797]: I1013 14:49:48.120975 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.120043 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.120656 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.120713 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.121591 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"267d09256aad9f3fe4d63c88f12e79b6b8c943961fc8e4ffcccbde0e623208bd"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.121655 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://267d09256aad9f3fe4d63c88f12e79b6b8c943961fc8e4ffcccbde0e623208bd" gracePeriod=600 Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.664795 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="267d09256aad9f3fe4d63c88f12e79b6b8c943961fc8e4ffcccbde0e623208bd" exitCode=0 Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.664855 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"267d09256aad9f3fe4d63c88f12e79b6b8c943961fc8e4ffcccbde0e623208bd"} Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.665328 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"} Oct 13 14:50:18 crc kubenswrapper[4797]: I1013 14:50:18.665351 4797 scope.go:117] "RemoveContainer" containerID="6e1792210eb1300e989630292282b9e027a6f68ede1e610221b234da4c9f7a00" Oct 13 14:50:58 crc kubenswrapper[4797]: I1013 14:50:58.053587 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-rzdsz"] Oct 13 14:50:58 crc kubenswrapper[4797]: I1013 14:50:58.066186 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-rzdsz"] Oct 13 14:50:59 crc kubenswrapper[4797]: I1013 14:50:59.250546 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1fb0bc1-01e6-4c15-bbfc-849333911dc5" path="/var/lib/kubelet/pods/d1fb0bc1-01e6-4c15-bbfc-849333911dc5/volumes" Oct 13 14:51:08 crc kubenswrapper[4797]: I1013 14:51:08.027654 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-01bb-account-create-shxkb"] Oct 13 14:51:08 crc kubenswrapper[4797]: I1013 14:51:08.037120 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-01bb-account-create-shxkb"] Oct 13 14:51:09 crc kubenswrapper[4797]: I1013 14:51:09.248167 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55226f30-4bd9-44e8-806d-38f43b21b396" path="/var/lib/kubelet/pods/55226f30-4bd9-44e8-806d-38f43b21b396/volumes" Oct 13 14:51:18 crc kubenswrapper[4797]: I1013 14:51:18.576148 4797 scope.go:117] "RemoveContainer" containerID="082ea0e307371daf17cdfd468e4cb66a164be8c76aaa0e195c1e0ffb4564273c" Oct 13 14:51:18 crc kubenswrapper[4797]: I1013 14:51:18.600416 4797 scope.go:117] "RemoveContainer" containerID="afd0317bc27bff6b8ff89a7a9479df6b562f6a9e3d85a1cb74319a2a7afff01c" Oct 13 14:51:22 crc kubenswrapper[4797]: I1013 14:51:22.047662 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-kf7zx"] Oct 13 14:51:22 crc kubenswrapper[4797]: I1013 14:51:22.057599 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-kf7zx"] Oct 13 14:51:23 crc kubenswrapper[4797]: I1013 14:51:23.257263 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302b0733-f0b0-4d8c-acc8-1f86c16a2920" path="/var/lib/kubelet/pods/302b0733-f0b0-4d8c-acc8-1f86c16a2920/volumes" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.986502 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-586x7"] Oct 13 14:51:33 crc kubenswrapper[4797]: E1013 14:51:33.987516 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="extract-utilities" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.987533 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="extract-utilities" Oct 13 14:51:33 crc kubenswrapper[4797]: E1013 14:51:33.987577 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="registry-server" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.987587 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="registry-server" Oct 13 14:51:33 crc kubenswrapper[4797]: E1013 14:51:33.987617 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="extract-content" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.987626 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="extract-content" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.987890 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba32138-2528-4387-9c7d-cc122d18caac" containerName="registry-server" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.989705 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:33 crc kubenswrapper[4797]: I1013 14:51:33.999169 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-586x7"] Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.118951 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptv2\" (UniqueName: \"kubernetes.io/projected/dbbb4a00-085c-442f-8fdd-c9711d96f930-kube-api-access-fptv2\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.119126 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-utilities\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.119198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-catalog-content\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.220649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-utilities\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.220743 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-catalog-content\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.220820 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptv2\" (UniqueName: \"kubernetes.io/projected/dbbb4a00-085c-442f-8fdd-c9711d96f930-kube-api-access-fptv2\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.221641 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-catalog-content\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.221641 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-utilities\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.247626 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptv2\" (UniqueName: \"kubernetes.io/projected/dbbb4a00-085c-442f-8fdd-c9711d96f930-kube-api-access-fptv2\") pod \"certified-operators-586x7\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.315593 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:34 crc kubenswrapper[4797]: I1013 14:51:34.842549 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-586x7"] Oct 13 14:51:35 crc kubenswrapper[4797]: E1013 14:51:35.256716 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbbb4a00_085c_442f_8fdd_c9711d96f930.slice/crio-conmon-975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86.scope\": RecentStats: unable to find data in memory cache]" Oct 13 14:51:35 crc kubenswrapper[4797]: I1013 14:51:35.421898 4797 generic.go:334] "Generic (PLEG): container finished" podID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerID="975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86" exitCode=0 Oct 13 14:51:35 crc kubenswrapper[4797]: I1013 14:51:35.421927 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerDied","Data":"975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86"} Oct 13 14:51:35 crc kubenswrapper[4797]: I1013 14:51:35.421965 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerStarted","Data":"af560ec76fd3e384e37bf3a5ae0f4972db2580c16b5aa9b0f062dee75559c374"} Oct 13 14:51:36 crc kubenswrapper[4797]: I1013 14:51:36.433788 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerStarted","Data":"31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84"} Oct 13 14:51:37 crc kubenswrapper[4797]: I1013 14:51:37.447194 4797 generic.go:334] "Generic (PLEG): 
container finished" podID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerID="31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84" exitCode=0 Oct 13 14:51:37 crc kubenswrapper[4797]: I1013 14:51:37.447329 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerDied","Data":"31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84"} Oct 13 14:51:38 crc kubenswrapper[4797]: I1013 14:51:38.459259 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerStarted","Data":"1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b"} Oct 13 14:51:38 crc kubenswrapper[4797]: I1013 14:51:38.479564 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-586x7" podStartSLOduration=3.05806113 podStartE2EDuration="5.479544352s" podCreationTimestamp="2025-10-13 14:51:33 +0000 UTC" firstStartedPulling="2025-10-13 14:51:35.423979898 +0000 UTC m=+6272.957530154" lastFinishedPulling="2025-10-13 14:51:37.84546312 +0000 UTC m=+6275.379013376" observedRunningTime="2025-10-13 14:51:38.47783618 +0000 UTC m=+6276.011386476" watchObservedRunningTime="2025-10-13 14:51:38.479544352 +0000 UTC m=+6276.013094618" Oct 13 14:51:41 crc kubenswrapper[4797]: I1013 14:51:41.037461 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-vmdmb"] Oct 13 14:51:41 crc kubenswrapper[4797]: I1013 14:51:41.046615 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-vmdmb"] Oct 13 14:51:41 crc kubenswrapper[4797]: I1013 14:51:41.252786 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a423f9-6797-471e-bbd6-d11cfc177239" path="/var/lib/kubelet/pods/70a423f9-6797-471e-bbd6-d11cfc177239/volumes" 
Oct 13 14:51:44 crc kubenswrapper[4797]: I1013 14:51:44.315933 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:44 crc kubenswrapper[4797]: I1013 14:51:44.316632 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:44 crc kubenswrapper[4797]: I1013 14:51:44.362580 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:44 crc kubenswrapper[4797]: I1013 14:51:44.578820 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:44 crc kubenswrapper[4797]: I1013 14:51:44.626523 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-586x7"] Oct 13 14:51:46 crc kubenswrapper[4797]: I1013 14:51:46.539020 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-586x7" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="registry-server" containerID="cri-o://1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b" gracePeriod=2 Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.048363 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.116977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-catalog-content\") pod \"dbbb4a00-085c-442f-8fdd-c9711d96f930\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.117143 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-utilities\") pod \"dbbb4a00-085c-442f-8fdd-c9711d96f930\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.117207 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fptv2\" (UniqueName: \"kubernetes.io/projected/dbbb4a00-085c-442f-8fdd-c9711d96f930-kube-api-access-fptv2\") pod \"dbbb4a00-085c-442f-8fdd-c9711d96f930\" (UID: \"dbbb4a00-085c-442f-8fdd-c9711d96f930\") " Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.118711 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-utilities" (OuterVolumeSpecName: "utilities") pod "dbbb4a00-085c-442f-8fdd-c9711d96f930" (UID: "dbbb4a00-085c-442f-8fdd-c9711d96f930"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.123673 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbbb4a00-085c-442f-8fdd-c9711d96f930-kube-api-access-fptv2" (OuterVolumeSpecName: "kube-api-access-fptv2") pod "dbbb4a00-085c-442f-8fdd-c9711d96f930" (UID: "dbbb4a00-085c-442f-8fdd-c9711d96f930"). InnerVolumeSpecName "kube-api-access-fptv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.160441 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbbb4a00-085c-442f-8fdd-c9711d96f930" (UID: "dbbb4a00-085c-442f-8fdd-c9711d96f930"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.219788 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.219854 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbbb4a00-085c-442f-8fdd-c9711d96f930-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.219872 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fptv2\" (UniqueName: \"kubernetes.io/projected/dbbb4a00-085c-442f-8fdd-c9711d96f930-kube-api-access-fptv2\") on node \"crc\" DevicePath \"\"" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.550541 4797 generic.go:334] "Generic (PLEG): container finished" podID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerID="1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b" exitCode=0 Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.550595 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerDied","Data":"1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b"} Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.550627 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-586x7" event={"ID":"dbbb4a00-085c-442f-8fdd-c9711d96f930","Type":"ContainerDied","Data":"af560ec76fd3e384e37bf3a5ae0f4972db2580c16b5aa9b0f062dee75559c374"} Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.550647 4797 scope.go:117] "RemoveContainer" containerID="1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.550727 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-586x7" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.576572 4797 scope.go:117] "RemoveContainer" containerID="31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.576731 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-586x7"] Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.585395 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-586x7"] Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.602042 4797 scope.go:117] "RemoveContainer" containerID="975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.652550 4797 scope.go:117] "RemoveContainer" containerID="1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b" Oct 13 14:51:47 crc kubenswrapper[4797]: E1013 14:51:47.653508 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b\": container with ID starting with 1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b not found: ID does not exist" containerID="1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 
14:51:47.653604 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b"} err="failed to get container status \"1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b\": rpc error: code = NotFound desc = could not find container \"1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b\": container with ID starting with 1b59cf31d4808884db6b5be99018c30dab4e1d5ae8152956bd1c7ab96a15552b not found: ID does not exist" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.653665 4797 scope.go:117] "RemoveContainer" containerID="31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84" Oct 13 14:51:47 crc kubenswrapper[4797]: E1013 14:51:47.654311 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84\": container with ID starting with 31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84 not found: ID does not exist" containerID="31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.654378 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84"} err="failed to get container status \"31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84\": rpc error: code = NotFound desc = could not find container \"31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84\": container with ID starting with 31c219a4d500b24f4fce81395938135fac8e460b844289354ae289756aa5fa84 not found: ID does not exist" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.654411 4797 scope.go:117] "RemoveContainer" containerID="975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86" Oct 13 14:51:47 crc 
kubenswrapper[4797]: E1013 14:51:47.654870 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86\": container with ID starting with 975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86 not found: ID does not exist" containerID="975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86" Oct 13 14:51:47 crc kubenswrapper[4797]: I1013 14:51:47.655034 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86"} err="failed to get container status \"975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86\": rpc error: code = NotFound desc = could not find container \"975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86\": container with ID starting with 975fb7fc2596949ff98f13ba0aa805589594dd30e1f76901e2a11d2af6d92f86 not found: ID does not exist" Oct 13 14:51:49 crc kubenswrapper[4797]: I1013 14:51:49.248429 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" path="/var/lib/kubelet/pods/dbbb4a00-085c-442f-8fdd-c9711d96f930/volumes" Oct 13 14:51:51 crc kubenswrapper[4797]: I1013 14:51:51.052195 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-cd37-account-create-chjm4"] Oct 13 14:51:51 crc kubenswrapper[4797]: I1013 14:51:51.064715 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-cd37-account-create-chjm4"] Oct 13 14:51:51 crc kubenswrapper[4797]: I1013 14:51:51.253118 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a" path="/var/lib/kubelet/pods/bd5395ec-ccd8-45a9-9c2f-aa8aeedc160a/volumes" Oct 13 14:52:03 crc kubenswrapper[4797]: I1013 14:52:03.037009 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-db-sync-pnslk"]
Oct 13 14:52:03 crc kubenswrapper[4797]: I1013 14:52:03.044956 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-pnslk"]
Oct 13 14:52:03 crc kubenswrapper[4797]: I1013 14:52:03.252199 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5348b8f-6498-4d16-bde9-20705e21127d" path="/var/lib/kubelet/pods/b5348b8f-6498-4d16-bde9-20705e21127d/volumes"
Oct 13 14:52:18 crc kubenswrapper[4797]: I1013 14:52:18.119865 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 14:52:18 crc kubenswrapper[4797]: I1013 14:52:18.120394 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 14:52:18 crc kubenswrapper[4797]: I1013 14:52:18.694463 4797 scope.go:117] "RemoveContainer" containerID="8d1871e05ce344551cfc2d6ba1bee2c6e3a82705548ab0ac6fa5db6f3814345c"
Oct 13 14:52:18 crc kubenswrapper[4797]: I1013 14:52:18.729287 4797 scope.go:117] "RemoveContainer" containerID="690f13fdc1ce6bb867d23f5782b24676165ab0c4e203303d58019049af46c27a"
Oct 13 14:52:18 crc kubenswrapper[4797]: I1013 14:52:18.783990 4797 scope.go:117] "RemoveContainer" containerID="3651e4efedbe2cd36246ba3e3251d5d5490eb4d5d10c66a45cafd8ca8f199fe5"
Oct 13 14:52:18 crc kubenswrapper[4797]: I1013 14:52:18.846843 4797 scope.go:117] "RemoveContainer" containerID="469039f7f4a985d485998e262d8bcf7587ff8148e1469857ab86adf7e8c54c0f"
Oct 13 14:52:48 crc kubenswrapper[4797]: I1013 14:52:48.120668 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 14:52:48 crc kubenswrapper[4797]: I1013 14:52:48.121527 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.119882 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.120500 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.120547 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs"
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.121345 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.121401 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" gracePeriod=600
Oct 13 14:53:18 crc kubenswrapper[4797]: E1013 14:53:18.241128 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.442341 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" exitCode=0
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.442400 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"}
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.442437 4797 scope.go:117] "RemoveContainer" containerID="267d09256aad9f3fe4d63c88f12e79b6b8c943961fc8e4ffcccbde0e623208bd"
Oct 13 14:53:18 crc kubenswrapper[4797]: I1013 14:53:18.443598 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:53:18 crc kubenswrapper[4797]: E1013 14:53:18.444406 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:53:30 crc kubenswrapper[4797]: I1013 14:53:30.236913 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:53:30 crc kubenswrapper[4797]: E1013 14:53:30.237685 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:53:41 crc kubenswrapper[4797]: I1013 14:53:41.678276 4797 generic.go:334] "Generic (PLEG): container finished" podID="ab353644-5bc4-4dea-a881-0c4009efb270" containerID="de05fba5e02a9244a7a3583b3037f9b6940b37571bb1dae2e668281c5541775c" exitCode=0
Oct 13 14:53:41 crc kubenswrapper[4797]: I1013 14:53:41.678367 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" event={"ID":"ab353644-5bc4-4dea-a881-0c4009efb270","Type":"ContainerDied","Data":"de05fba5e02a9244a7a3583b3037f9b6940b37571bb1dae2e668281c5541775c"}
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.135369 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb"
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.212629 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ceph\") pod \"ab353644-5bc4-4dea-a881-0c4009efb270\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") "
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.213028 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xlwx\" (UniqueName: \"kubernetes.io/projected/ab353644-5bc4-4dea-a881-0c4009efb270-kube-api-access-6xlwx\") pod \"ab353644-5bc4-4dea-a881-0c4009efb270\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") "
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.213227 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ssh-key\") pod \"ab353644-5bc4-4dea-a881-0c4009efb270\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") "
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.213466 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-inventory\") pod \"ab353644-5bc4-4dea-a881-0c4009efb270\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") "
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.213518 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-tripleo-cleanup-combined-ca-bundle\") pod \"ab353644-5bc4-4dea-a881-0c4009efb270\" (UID: \"ab353644-5bc4-4dea-a881-0c4009efb270\") "
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.220110 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ceph" (OuterVolumeSpecName: "ceph") pod "ab353644-5bc4-4dea-a881-0c4009efb270" (UID: "ab353644-5bc4-4dea-a881-0c4009efb270"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.220210 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab353644-5bc4-4dea-a881-0c4009efb270-kube-api-access-6xlwx" (OuterVolumeSpecName: "kube-api-access-6xlwx") pod "ab353644-5bc4-4dea-a881-0c4009efb270" (UID: "ab353644-5bc4-4dea-a881-0c4009efb270"). InnerVolumeSpecName "kube-api-access-6xlwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.222120 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "ab353644-5bc4-4dea-a881-0c4009efb270" (UID: "ab353644-5bc4-4dea-a881-0c4009efb270"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.243764 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:53:43 crc kubenswrapper[4797]: E1013 14:53:43.244077 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.251138 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-inventory" (OuterVolumeSpecName: "inventory") pod "ab353644-5bc4-4dea-a881-0c4009efb270" (UID: "ab353644-5bc4-4dea-a881-0c4009efb270"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.251052 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ab353644-5bc4-4dea-a881-0c4009efb270" (UID: "ab353644-5bc4-4dea-a881-0c4009efb270"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.316074 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ceph\") on node \"crc\" DevicePath \"\""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.316120 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xlwx\" (UniqueName: \"kubernetes.io/projected/ab353644-5bc4-4dea-a881-0c4009efb270-kube-api-access-6xlwx\") on node \"crc\" DevicePath \"\""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.316131 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.316142 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-inventory\") on node \"crc\" DevicePath \"\""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.316152 4797 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab353644-5bc4-4dea-a881-0c4009efb270-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.701355 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb" event={"ID":"ab353644-5bc4-4dea-a881-0c4009efb270","Type":"ContainerDied","Data":"f7c386fa5ec36d2706ecd8472075c7015a5b221dac631748966edfffbbee54a1"}
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.701415 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb"
Oct 13 14:53:43 crc kubenswrapper[4797]: I1013 14:53:43.701456 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c386fa5ec36d2706ecd8472075c7015a5b221dac631748966edfffbbee54a1"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.687185 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-znhqv"]
Oct 13 14:53:50 crc kubenswrapper[4797]: E1013 14:53:50.688334 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="registry-server"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.688352 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="registry-server"
Oct 13 14:53:50 crc kubenswrapper[4797]: E1013 14:53:50.688369 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab353644-5bc4-4dea-a881-0c4009efb270" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.688379 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab353644-5bc4-4dea-a881-0c4009efb270" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Oct 13 14:53:50 crc kubenswrapper[4797]: E1013 14:53:50.688407 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="extract-utilities"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.688416 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="extract-utilities"
Oct 13 14:53:50 crc kubenswrapper[4797]: E1013 14:53:50.688424 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="extract-content"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.688432 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="extract-content"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.688734 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab353644-5bc4-4dea-a881-0c4009efb270" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.688765 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbb4a00-085c-442f-8fdd-c9711d96f930" containerName="registry-server"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.689880 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.692239 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.692484 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.693710 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.693904 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.716089 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-znhqv"]
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.778986 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.779171 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ceph\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.779221 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbdrw\" (UniqueName: \"kubernetes.io/projected/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-kube-api-access-gbdrw\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.779266 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.779288 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-inventory\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.881933 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ceph\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.881997 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbdrw\" (UniqueName: \"kubernetes.io/projected/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-kube-api-access-gbdrw\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.882061 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.882509 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-inventory\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.882765 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.887680 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-inventory\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.888293 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ceph\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.888630 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.888684 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:50 crc kubenswrapper[4797]: I1013 14:53:50.901664 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbdrw\" (UniqueName: \"kubernetes.io/projected/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-kube-api-access-gbdrw\") pod \"bootstrap-openstack-openstack-cell1-znhqv\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:51 crc kubenswrapper[4797]: I1013 14:53:51.018886 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv"
Oct 13 14:53:51 crc kubenswrapper[4797]: I1013 14:53:51.557490 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-znhqv"]
Oct 13 14:53:51 crc kubenswrapper[4797]: I1013 14:53:51.569330 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 14:53:51 crc kubenswrapper[4797]: I1013 14:53:51.791148 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" event={"ID":"6ddd3dd4-0ef9-495e-9ffa-8358884cb552","Type":"ContainerStarted","Data":"636f0656a4fc3fe4202ed617d5d15ed8fa777c712b1341c12654b13704266bb6"}
Oct 13 14:53:52 crc kubenswrapper[4797]: I1013 14:53:52.804490 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" event={"ID":"6ddd3dd4-0ef9-495e-9ffa-8358884cb552","Type":"ContainerStarted","Data":"d3831cd78b1c0de727a9d895b871c052212ac5796e43a330f06c536c52ef26cf"}
Oct 13 14:53:52 crc kubenswrapper[4797]: I1013 14:53:52.819960 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" podStartSLOduration=2.201237386 podStartE2EDuration="2.819939681s" podCreationTimestamp="2025-10-13 14:53:50 +0000 UTC" firstStartedPulling="2025-10-13 14:53:51.568715323 +0000 UTC m=+6409.102265589" lastFinishedPulling="2025-10-13 14:53:52.187417628 +0000 UTC m=+6409.720967884" observedRunningTime="2025-10-13 14:53:52.818083326 +0000 UTC m=+6410.351633612" watchObservedRunningTime="2025-10-13 14:53:52.819939681 +0000 UTC m=+6410.353489937"
Oct 13 14:53:55 crc kubenswrapper[4797]: I1013 14:53:55.236529 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:53:55 crc kubenswrapper[4797]: E1013 14:53:55.237464 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:54:10 crc kubenswrapper[4797]: I1013 14:54:10.236800 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:54:10 crc kubenswrapper[4797]: E1013 14:54:10.238299 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:54:21 crc kubenswrapper[4797]: I1013 14:54:21.236255 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:54:21 crc kubenswrapper[4797]: E1013 14:54:21.237105 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:54:32 crc kubenswrapper[4797]: I1013 14:54:32.236547 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:54:32 crc kubenswrapper[4797]: E1013 14:54:32.237772 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:54:47 crc kubenswrapper[4797]: I1013 14:54:47.237273 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:54:47 crc kubenswrapper[4797]: E1013 14:54:47.238232 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:54:59 crc kubenswrapper[4797]: I1013 14:54:59.236118 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:54:59 crc kubenswrapper[4797]: E1013 14:54:59.236884 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:55:14 crc kubenswrapper[4797]: I1013 14:55:14.236338 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:55:14 crc kubenswrapper[4797]: E1013 14:55:14.237117 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:55:28 crc kubenswrapper[4797]: I1013 14:55:28.236307 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:55:28 crc kubenswrapper[4797]: E1013 14:55:28.237096 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:55:40 crc kubenswrapper[4797]: I1013 14:55:40.235954 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:55:40 crc kubenswrapper[4797]: E1013 14:55:40.237109 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:55:52 crc kubenswrapper[4797]: I1013 14:55:52.236575 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:55:52 crc kubenswrapper[4797]: E1013 14:55:52.237335 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:56:06 crc kubenswrapper[4797]: I1013 14:56:06.236937 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:56:06 crc kubenswrapper[4797]: E1013 14:56:06.238005 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:56:21 crc kubenswrapper[4797]: I1013 14:56:21.238283 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:56:21 crc kubenswrapper[4797]: E1013 14:56:21.239538 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:56:34 crc kubenswrapper[4797]: I1013 14:56:34.779711 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8gpv"]
Oct 13 14:56:34 crc kubenswrapper[4797]: I1013 14:56:34.783231 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:34 crc kubenswrapper[4797]: I1013 14:56:34.810835 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gpv"]
Oct 13 14:56:34 crc kubenswrapper[4797]: I1013 14:56:34.929223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvp52\" (UniqueName: \"kubernetes.io/projected/4880b275-4169-4a3b-a56e-5fd9e892e4d4-kube-api-access-qvp52\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:34 crc kubenswrapper[4797]: I1013 14:56:34.929298 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-utilities\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:34 crc kubenswrapper[4797]: I1013 14:56:34.929507 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-catalog-content\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.031510 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-catalog-content\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.031958 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-catalog-content\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.032142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvp52\" (UniqueName: \"kubernetes.io/projected/4880b275-4169-4a3b-a56e-5fd9e892e4d4-kube-api-access-qvp52\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.032283 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-utilities\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.032719 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-utilities\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.055849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvp52\" (UniqueName: \"kubernetes.io/projected/4880b275-4169-4a3b-a56e-5fd9e892e4d4-kube-api-access-qvp52\") pod \"community-operators-k8gpv\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.103452 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gpv"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.236093 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c"
Oct 13 14:56:35 crc kubenswrapper[4797]: E1013 14:56:35.236755 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"
Oct 13 14:56:35 crc kubenswrapper[4797]: I1013 14:56:35.663187 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gpv"]
Oct 13 14:56:36 crc kubenswrapper[4797]: I1013 14:56:36.387772 4797 generic.go:334] "Generic (PLEG): container finished" podID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerID="84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8" exitCode=0
Oct 13 14:56:36 crc kubenswrapper[4797]: I1013 14:56:36.387849 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv" event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerDied","Data":"84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8"}
Oct 13 14:56:36 crc kubenswrapper[4797]: I1013 14:56:36.388253 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv"
event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerStarted","Data":"a383eae700b8f1caf30df0d77fa063517fca36e18558c1df8ed8221339c5bd91"} Oct 13 14:56:38 crc kubenswrapper[4797]: I1013 14:56:38.409634 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv" event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerStarted","Data":"703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6"} Oct 13 14:56:39 crc kubenswrapper[4797]: I1013 14:56:39.425703 4797 generic.go:334] "Generic (PLEG): container finished" podID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerID="703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6" exitCode=0 Oct 13 14:56:39 crc kubenswrapper[4797]: I1013 14:56:39.425863 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv" event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerDied","Data":"703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6"} Oct 13 14:56:40 crc kubenswrapper[4797]: I1013 14:56:40.437750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv" event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerStarted","Data":"8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab"} Oct 13 14:56:40 crc kubenswrapper[4797]: I1013 14:56:40.466118 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8gpv" podStartSLOduration=3.058871216 podStartE2EDuration="6.466096863s" podCreationTimestamp="2025-10-13 14:56:34 +0000 UTC" firstStartedPulling="2025-10-13 14:56:36.390753739 +0000 UTC m=+6573.924304005" lastFinishedPulling="2025-10-13 14:56:39.797979396 +0000 UTC m=+6577.331529652" observedRunningTime="2025-10-13 14:56:40.4549919 +0000 UTC m=+6577.988542186" watchObservedRunningTime="2025-10-13 14:56:40.466096863 +0000 UTC 
m=+6577.999647119" Oct 13 14:56:45 crc kubenswrapper[4797]: I1013 14:56:45.105194 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8gpv" Oct 13 14:56:45 crc kubenswrapper[4797]: I1013 14:56:45.105875 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8gpv" Oct 13 14:56:45 crc kubenswrapper[4797]: I1013 14:56:45.164628 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8gpv" Oct 13 14:56:45 crc kubenswrapper[4797]: I1013 14:56:45.586185 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8gpv" Oct 13 14:56:45 crc kubenswrapper[4797]: I1013 14:56:45.645550 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gpv"] Oct 13 14:56:46 crc kubenswrapper[4797]: I1013 14:56:46.236453 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:56:46 crc kubenswrapper[4797]: E1013 14:56:46.236743 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:56:47 crc kubenswrapper[4797]: I1013 14:56:47.548680 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8gpv" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="registry-server" containerID="cri-o://8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab" gracePeriod=2 Oct 
13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.019941 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gpv" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.216751 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-utilities\") pod \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.216932 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-catalog-content\") pod \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.217131 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvp52\" (UniqueName: \"kubernetes.io/projected/4880b275-4169-4a3b-a56e-5fd9e892e4d4-kube-api-access-qvp52\") pod \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\" (UID: \"4880b275-4169-4a3b-a56e-5fd9e892e4d4\") " Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.218548 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-utilities" (OuterVolumeSpecName: "utilities") pod "4880b275-4169-4a3b-a56e-5fd9e892e4d4" (UID: "4880b275-4169-4a3b-a56e-5fd9e892e4d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.228920 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4880b275-4169-4a3b-a56e-5fd9e892e4d4-kube-api-access-qvp52" (OuterVolumeSpecName: "kube-api-access-qvp52") pod "4880b275-4169-4a3b-a56e-5fd9e892e4d4" (UID: "4880b275-4169-4a3b-a56e-5fd9e892e4d4"). InnerVolumeSpecName "kube-api-access-qvp52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.275100 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4880b275-4169-4a3b-a56e-5fd9e892e4d4" (UID: "4880b275-4169-4a3b-a56e-5fd9e892e4d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.319845 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvp52\" (UniqueName: \"kubernetes.io/projected/4880b275-4169-4a3b-a56e-5fd9e892e4d4-kube-api-access-qvp52\") on node \"crc\" DevicePath \"\"" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.319883 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.319895 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4880b275-4169-4a3b-a56e-5fd9e892e4d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.568569 4797 generic.go:334] "Generic (PLEG): container finished" podID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" 
containerID="8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab" exitCode=0 Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.568614 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv" event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerDied","Data":"8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab"} Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.568678 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gpv" event={"ID":"4880b275-4169-4a3b-a56e-5fd9e892e4d4","Type":"ContainerDied","Data":"a383eae700b8f1caf30df0d77fa063517fca36e18558c1df8ed8221339c5bd91"} Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.568704 4797 scope.go:117] "RemoveContainer" containerID="8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.568644 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8gpv" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.590598 4797 scope.go:117] "RemoveContainer" containerID="703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.601576 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gpv"] Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.609044 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8gpv"] Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.620335 4797 scope.go:117] "RemoveContainer" containerID="84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.657234 4797 scope.go:117] "RemoveContainer" containerID="8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab" Oct 13 14:56:48 crc kubenswrapper[4797]: E1013 14:56:48.657771 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab\": container with ID starting with 8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab not found: ID does not exist" containerID="8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.657828 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab"} err="failed to get container status \"8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab\": rpc error: code = NotFound desc = could not find container \"8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab\": container with ID starting with 8a14f10d2fc594598e4d6a55f6a326cdab3bb9d09fa45622a4390f2d8271baab not 
found: ID does not exist" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.657856 4797 scope.go:117] "RemoveContainer" containerID="703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6" Oct 13 14:56:48 crc kubenswrapper[4797]: E1013 14:56:48.658193 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6\": container with ID starting with 703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6 not found: ID does not exist" containerID="703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.658233 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6"} err="failed to get container status \"703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6\": rpc error: code = NotFound desc = could not find container \"703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6\": container with ID starting with 703ceb6f677c9b2bd6d65d8228c092b19d6d9437dd2f0e4554ed3ae54db8c6f6 not found: ID does not exist" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.658260 4797 scope.go:117] "RemoveContainer" containerID="84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8" Oct 13 14:56:48 crc kubenswrapper[4797]: E1013 14:56:48.658501 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8\": container with ID starting with 84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8 not found: ID does not exist" containerID="84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8" Oct 13 14:56:48 crc kubenswrapper[4797]: I1013 14:56:48.658530 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8"} err="failed to get container status \"84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8\": rpc error: code = NotFound desc = could not find container \"84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8\": container with ID starting with 84f703d0e9336044d8a47ec410b6d429cbf3e19246a068ca5328766855d0aac8 not found: ID does not exist" Oct 13 14:56:49 crc kubenswrapper[4797]: I1013 14:56:49.248965 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" path="/var/lib/kubelet/pods/4880b275-4169-4a3b-a56e-5fd9e892e4d4/volumes" Oct 13 14:57:00 crc kubenswrapper[4797]: I1013 14:57:00.236341 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:57:00 crc kubenswrapper[4797]: E1013 14:57:00.237661 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:57:04 crc kubenswrapper[4797]: I1013 14:57:04.722625 4797 generic.go:334] "Generic (PLEG): container finished" podID="6ddd3dd4-0ef9-495e-9ffa-8358884cb552" containerID="d3831cd78b1c0de727a9d895b871c052212ac5796e43a330f06c536c52ef26cf" exitCode=0 Oct 13 14:57:04 crc kubenswrapper[4797]: I1013 14:57:04.722754 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" 
event={"ID":"6ddd3dd4-0ef9-495e-9ffa-8358884cb552","Type":"ContainerDied","Data":"d3831cd78b1c0de727a9d895b871c052212ac5796e43a330f06c536c52ef26cf"} Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.161968 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.286623 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-bootstrap-combined-ca-bundle\") pod \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.287032 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ceph\") pod \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.287095 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ssh-key\") pod \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.287150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbdrw\" (UniqueName: \"kubernetes.io/projected/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-kube-api-access-gbdrw\") pod \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.287354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-inventory\") pod \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\" (UID: \"6ddd3dd4-0ef9-495e-9ffa-8358884cb552\") " Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.294148 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-kube-api-access-gbdrw" (OuterVolumeSpecName: "kube-api-access-gbdrw") pod "6ddd3dd4-0ef9-495e-9ffa-8358884cb552" (UID: "6ddd3dd4-0ef9-495e-9ffa-8358884cb552"). InnerVolumeSpecName "kube-api-access-gbdrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.294209 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ceph" (OuterVolumeSpecName: "ceph") pod "6ddd3dd4-0ef9-495e-9ffa-8358884cb552" (UID: "6ddd3dd4-0ef9-495e-9ffa-8358884cb552"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.295838 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6ddd3dd4-0ef9-495e-9ffa-8358884cb552" (UID: "6ddd3dd4-0ef9-495e-9ffa-8358884cb552"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.319345 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-inventory" (OuterVolumeSpecName: "inventory") pod "6ddd3dd4-0ef9-495e-9ffa-8358884cb552" (UID: "6ddd3dd4-0ef9-495e-9ffa-8358884cb552"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.347477 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ddd3dd4-0ef9-495e-9ffa-8358884cb552" (UID: "6ddd3dd4-0ef9-495e-9ffa-8358884cb552"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.390241 4797 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.390289 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.390301 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.390316 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbdrw\" (UniqueName: \"kubernetes.io/projected/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-kube-api-access-gbdrw\") on node \"crc\" DevicePath \"\"" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.390330 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ddd3dd4-0ef9-495e-9ffa-8358884cb552-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.743628 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" 
event={"ID":"6ddd3dd4-0ef9-495e-9ffa-8358884cb552","Type":"ContainerDied","Data":"636f0656a4fc3fe4202ed617d5d15ed8fa777c712b1341c12654b13704266bb6"} Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.743683 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="636f0656a4fc3fe4202ed617d5d15ed8fa777c712b1341c12654b13704266bb6" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.743718 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-znhqv" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840144 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-6wfff"] Oct 13 14:57:06 crc kubenswrapper[4797]: E1013 14:57:06.840544 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="extract-utilities" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840567 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="extract-utilities" Oct 13 14:57:06 crc kubenswrapper[4797]: E1013 14:57:06.840612 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="registry-server" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840621 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="registry-server" Oct 13 14:57:06 crc kubenswrapper[4797]: E1013 14:57:06.840638 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="extract-content" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840644 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="extract-content" Oct 13 14:57:06 crc kubenswrapper[4797]: E1013 
14:57:06.840660 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ddd3dd4-0ef9-495e-9ffa-8358884cb552" containerName="bootstrap-openstack-openstack-cell1" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840666 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ddd3dd4-0ef9-495e-9ffa-8358884cb552" containerName="bootstrap-openstack-openstack-cell1" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840858 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ddd3dd4-0ef9-495e-9ffa-8358884cb552" containerName="bootstrap-openstack-openstack-cell1" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.840875 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4880b275-4169-4a3b-a56e-5fd9e892e4d4" containerName="registry-server" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.841622 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.843671 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.844290 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.844411 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.845067 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 14:57:06 crc kubenswrapper[4797]: I1013 14:57:06.856466 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-6wfff"] Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.007170 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ssh-key\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.007333 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ceph\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.007369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59lms\" (UniqueName: \"kubernetes.io/projected/e800517f-8039-464d-8137-cc928b84cc79-kube-api-access-59lms\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.007431 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-inventory\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.109428 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ssh-key\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " 
pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.109575 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ceph\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.109612 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59lms\" (UniqueName: \"kubernetes.io/projected/e800517f-8039-464d-8137-cc928b84cc79-kube-api-access-59lms\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.109636 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-inventory\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.118524 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ceph\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.121553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-inventory\") pod 
\"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.122089 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ssh-key\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.138571 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59lms\" (UniqueName: \"kubernetes.io/projected/e800517f-8039-464d-8137-cc928b84cc79-kube-api-access-59lms\") pod \"download-cache-openstack-openstack-cell1-6wfff\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.218796 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:57:07 crc kubenswrapper[4797]: I1013 14:57:07.755637 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-6wfff"] Oct 13 14:57:08 crc kubenswrapper[4797]: I1013 14:57:08.764001 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" event={"ID":"e800517f-8039-464d-8137-cc928b84cc79","Type":"ContainerStarted","Data":"15242632ac45f1460411676950ef8f1044f44a5e302d7d11029f09416fa19d75"} Oct 13 14:57:08 crc kubenswrapper[4797]: I1013 14:57:08.764759 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" event={"ID":"e800517f-8039-464d-8137-cc928b84cc79","Type":"ContainerStarted","Data":"425231eb7234f7dd3fb6d6369445da8307a5ace9ce72e8509fde15632ba6cc9c"} Oct 13 14:57:08 crc kubenswrapper[4797]: I1013 14:57:08.787697 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" podStartSLOduration=2.219821967 podStartE2EDuration="2.787671873s" podCreationTimestamp="2025-10-13 14:57:06 +0000 UTC" firstStartedPulling="2025-10-13 14:57:07.755262923 +0000 UTC m=+6605.288813179" lastFinishedPulling="2025-10-13 14:57:08.323112829 +0000 UTC m=+6605.856663085" observedRunningTime="2025-10-13 14:57:08.783179753 +0000 UTC m=+6606.316730019" watchObservedRunningTime="2025-10-13 14:57:08.787671873 +0000 UTC m=+6606.321222129" Oct 13 14:57:15 crc kubenswrapper[4797]: I1013 14:57:15.236576 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:57:15 crc kubenswrapper[4797]: E1013 14:57:15.237474 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:57:30 crc kubenswrapper[4797]: I1013 14:57:30.236334 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:57:30 crc kubenswrapper[4797]: E1013 14:57:30.237131 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:57:45 crc kubenswrapper[4797]: I1013 14:57:45.236868 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:57:45 crc kubenswrapper[4797]: E1013 14:57:45.237604 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:58:00 crc kubenswrapper[4797]: I1013 14:58:00.236773 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:58:00 crc kubenswrapper[4797]: E1013 14:58:00.238028 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.598072 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jzpjw"] Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.600738 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.606546 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jzpjw"] Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.627326 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-catalog-content\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.627540 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jq47\" (UniqueName: \"kubernetes.io/projected/06251d27-3d6c-424d-a24e-2f432999d518-kube-api-access-4jq47\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.629229 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-utilities\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " 
pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.731053 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-catalog-content\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.731109 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jq47\" (UniqueName: \"kubernetes.io/projected/06251d27-3d6c-424d-a24e-2f432999d518-kube-api-access-4jq47\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.731132 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-utilities\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.731622 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-catalog-content\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.731651 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-utilities\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc 
kubenswrapper[4797]: I1013 14:58:05.752070 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jq47\" (UniqueName: \"kubernetes.io/projected/06251d27-3d6c-424d-a24e-2f432999d518-kube-api-access-4jq47\") pod \"redhat-operators-jzpjw\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:05 crc kubenswrapper[4797]: I1013 14:58:05.930787 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:06 crc kubenswrapper[4797]: I1013 14:58:06.406637 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jzpjw"] Oct 13 14:58:07 crc kubenswrapper[4797]: I1013 14:58:07.353973 4797 generic.go:334] "Generic (PLEG): container finished" podID="06251d27-3d6c-424d-a24e-2f432999d518" containerID="6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34" exitCode=0 Oct 13 14:58:07 crc kubenswrapper[4797]: I1013 14:58:07.354012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerDied","Data":"6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34"} Oct 13 14:58:07 crc kubenswrapper[4797]: I1013 14:58:07.354037 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerStarted","Data":"3a6459c5e67583e963b963f7b5889b4ee5121ee50fba3590df2b927b1d835374"} Oct 13 14:58:10 crc kubenswrapper[4797]: I1013 14:58:10.383658 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerStarted","Data":"316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06"} Oct 13 14:58:11 crc kubenswrapper[4797]: I1013 
14:58:11.396461 4797 generic.go:334] "Generic (PLEG): container finished" podID="06251d27-3d6c-424d-a24e-2f432999d518" containerID="316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06" exitCode=0 Oct 13 14:58:11 crc kubenswrapper[4797]: I1013 14:58:11.396572 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerDied","Data":"316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06"} Oct 13 14:58:13 crc kubenswrapper[4797]: I1013 14:58:13.244184 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:58:13 crc kubenswrapper[4797]: E1013 14:58:13.244851 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 14:58:13 crc kubenswrapper[4797]: I1013 14:58:13.431156 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerStarted","Data":"9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed"} Oct 13 14:58:13 crc kubenswrapper[4797]: I1013 14:58:13.461641 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jzpjw" podStartSLOduration=3.6043137229999997 podStartE2EDuration="8.461613296s" podCreationTimestamp="2025-10-13 14:58:05 +0000 UTC" firstStartedPulling="2025-10-13 14:58:07.356839277 +0000 UTC m=+6664.890389533" lastFinishedPulling="2025-10-13 14:58:12.21413885 +0000 UTC m=+6669.747689106" 
observedRunningTime="2025-10-13 14:58:13.451045587 +0000 UTC m=+6670.984595853" watchObservedRunningTime="2025-10-13 14:58:13.461613296 +0000 UTC m=+6670.995163552" Oct 13 14:58:15 crc kubenswrapper[4797]: I1013 14:58:15.931063 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:15 crc kubenswrapper[4797]: I1013 14:58:15.931474 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.843544 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cp292"] Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.845912 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.863249 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-catalog-content\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.863340 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mtcz\" (UniqueName: \"kubernetes.io/projected/7e439833-7cdd-42be-8f0e-058085fd5e08-kube-api-access-2mtcz\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.863420 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-utilities\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.871710 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp292"] Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.965679 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-catalog-content\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.966110 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mtcz\" (UniqueName: \"kubernetes.io/projected/7e439833-7cdd-42be-8f0e-058085fd5e08-kube-api-access-2mtcz\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.966149 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-utilities\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.966852 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-utilities\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 
14:58:16.967081 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-catalog-content\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.989212 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jzpjw" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="registry-server" probeResult="failure" output=< Oct 13 14:58:16 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 14:58:16 crc kubenswrapper[4797]: > Oct 13 14:58:16 crc kubenswrapper[4797]: I1013 14:58:16.997246 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mtcz\" (UniqueName: \"kubernetes.io/projected/7e439833-7cdd-42be-8f0e-058085fd5e08-kube-api-access-2mtcz\") pod \"redhat-marketplace-cp292\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:17 crc kubenswrapper[4797]: I1013 14:58:17.185380 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:17 crc kubenswrapper[4797]: I1013 14:58:17.700690 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp292"] Oct 13 14:58:18 crc kubenswrapper[4797]: I1013 14:58:18.506021 4797 generic.go:334] "Generic (PLEG): container finished" podID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerID="96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d" exitCode=0 Oct 13 14:58:18 crc kubenswrapper[4797]: I1013 14:58:18.506074 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp292" event={"ID":"7e439833-7cdd-42be-8f0e-058085fd5e08","Type":"ContainerDied","Data":"96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d"} Oct 13 14:58:18 crc kubenswrapper[4797]: I1013 14:58:18.506371 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp292" event={"ID":"7e439833-7cdd-42be-8f0e-058085fd5e08","Type":"ContainerStarted","Data":"140f76c998bd0743159cc9a13f9b533891925892b13f5964fdde55a4754c8140"} Oct 13 14:58:19 crc kubenswrapper[4797]: I1013 14:58:19.517663 4797 generic.go:334] "Generic (PLEG): container finished" podID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerID="1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48" exitCode=0 Oct 13 14:58:19 crc kubenswrapper[4797]: I1013 14:58:19.517871 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp292" event={"ID":"7e439833-7cdd-42be-8f0e-058085fd5e08","Type":"ContainerDied","Data":"1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48"} Oct 13 14:58:20 crc kubenswrapper[4797]: I1013 14:58:20.533319 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp292" 
event={"ID":"7e439833-7cdd-42be-8f0e-058085fd5e08","Type":"ContainerStarted","Data":"59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8"} Oct 13 14:58:20 crc kubenswrapper[4797]: I1013 14:58:20.557225 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cp292" podStartSLOduration=3.072591244 podStartE2EDuration="4.557199336s" podCreationTimestamp="2025-10-13 14:58:16 +0000 UTC" firstStartedPulling="2025-10-13 14:58:18.508614592 +0000 UTC m=+6676.042164848" lastFinishedPulling="2025-10-13 14:58:19.993222674 +0000 UTC m=+6677.526772940" observedRunningTime="2025-10-13 14:58:20.551538617 +0000 UTC m=+6678.085088893" watchObservedRunningTime="2025-10-13 14:58:20.557199336 +0000 UTC m=+6678.090749592" Oct 13 14:58:25 crc kubenswrapper[4797]: I1013 14:58:25.237023 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 14:58:25 crc kubenswrapper[4797]: I1013 14:58:25.584848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"9ced763cf8f63ef478d23b8f41f116ec9c1aafb73fa9427083e41fbea89d39fa"} Oct 13 14:58:25 crc kubenswrapper[4797]: I1013 14:58:25.996798 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:26 crc kubenswrapper[4797]: I1013 14:58:26.052227 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:26 crc kubenswrapper[4797]: I1013 14:58:26.236610 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jzpjw"] Oct 13 14:58:27 crc kubenswrapper[4797]: I1013 14:58:27.186563 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:27 crc kubenswrapper[4797]: I1013 14:58:27.187219 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:27 crc kubenswrapper[4797]: I1013 14:58:27.255005 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:27 crc kubenswrapper[4797]: I1013 14:58:27.603682 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jzpjw" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="registry-server" containerID="cri-o://9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed" gracePeriod=2 Oct 13 14:58:27 crc kubenswrapper[4797]: I1013 14:58:27.658990 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.099492 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.126104 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-utilities\") pod \"06251d27-3d6c-424d-a24e-2f432999d518\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.126195 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jq47\" (UniqueName: \"kubernetes.io/projected/06251d27-3d6c-424d-a24e-2f432999d518-kube-api-access-4jq47\") pod \"06251d27-3d6c-424d-a24e-2f432999d518\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.126233 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-catalog-content\") pod \"06251d27-3d6c-424d-a24e-2f432999d518\" (UID: \"06251d27-3d6c-424d-a24e-2f432999d518\") " Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.127475 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-utilities" (OuterVolumeSpecName: "utilities") pod "06251d27-3d6c-424d-a24e-2f432999d518" (UID: "06251d27-3d6c-424d-a24e-2f432999d518"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.139107 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06251d27-3d6c-424d-a24e-2f432999d518-kube-api-access-4jq47" (OuterVolumeSpecName: "kube-api-access-4jq47") pod "06251d27-3d6c-424d-a24e-2f432999d518" (UID: "06251d27-3d6c-424d-a24e-2f432999d518"). InnerVolumeSpecName "kube-api-access-4jq47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.200380 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06251d27-3d6c-424d-a24e-2f432999d518" (UID: "06251d27-3d6c-424d-a24e-2f432999d518"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.227927 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.227990 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jq47\" (UniqueName: \"kubernetes.io/projected/06251d27-3d6c-424d-a24e-2f432999d518-kube-api-access-4jq47\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.228006 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06251d27-3d6c-424d-a24e-2f432999d518-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.613710 4797 generic.go:334] "Generic (PLEG): container finished" podID="06251d27-3d6c-424d-a24e-2f432999d518" containerID="9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed" exitCode=0 Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.613767 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jzpjw" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.613821 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerDied","Data":"9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed"} Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.613882 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzpjw" event={"ID":"06251d27-3d6c-424d-a24e-2f432999d518","Type":"ContainerDied","Data":"3a6459c5e67583e963b963f7b5889b4ee5121ee50fba3590df2b927b1d835374"} Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.613906 4797 scope.go:117] "RemoveContainer" containerID="9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.636177 4797 scope.go:117] "RemoveContainer" containerID="316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.649275 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jzpjw"] Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.657323 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jzpjw"] Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.679470 4797 scope.go:117] "RemoveContainer" containerID="6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.721670 4797 scope.go:117] "RemoveContainer" containerID="9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed" Oct 13 14:58:28 crc kubenswrapper[4797]: E1013 14:58:28.722257 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed\": container with ID starting with 9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed not found: ID does not exist" containerID="9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.722298 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed"} err="failed to get container status \"9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed\": rpc error: code = NotFound desc = could not find container \"9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed\": container with ID starting with 9ed0030b2abd543d0c6326e4bf886c9673bb22570ec4dd4379efc8184fc7f6ed not found: ID does not exist" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.722321 4797 scope.go:117] "RemoveContainer" containerID="316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06" Oct 13 14:58:28 crc kubenswrapper[4797]: E1013 14:58:28.722786 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06\": container with ID starting with 316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06 not found: ID does not exist" containerID="316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.722870 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06"} err="failed to get container status \"316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06\": rpc error: code = NotFound desc = could not find container \"316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06\": container with ID 
starting with 316bf392b4aa7bfda50e4088d3b6dd5dc6b26babaaab4fe953ec0b760175ef06 not found: ID does not exist" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.722915 4797 scope.go:117] "RemoveContainer" containerID="6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34" Oct 13 14:58:28 crc kubenswrapper[4797]: E1013 14:58:28.723240 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34\": container with ID starting with 6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34 not found: ID does not exist" containerID="6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34" Oct 13 14:58:28 crc kubenswrapper[4797]: I1013 14:58:28.723279 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34"} err="failed to get container status \"6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34\": rpc error: code = NotFound desc = could not find container \"6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34\": container with ID starting with 6714b81aa82efd2400a1ebc3a945df3c01442d09dc8b593f594b263058f6bc34 not found: ID does not exist" Oct 13 14:58:29 crc kubenswrapper[4797]: I1013 14:58:29.034799 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp292"] Oct 13 14:58:29 crc kubenswrapper[4797]: I1013 14:58:29.252395 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06251d27-3d6c-424d-a24e-2f432999d518" path="/var/lib/kubelet/pods/06251d27-3d6c-424d-a24e-2f432999d518/volumes" Oct 13 14:58:29 crc kubenswrapper[4797]: I1013 14:58:29.624420 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cp292" 
podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="registry-server" containerID="cri-o://59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8" gracePeriod=2 Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.126116 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.268658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-catalog-content\") pod \"7e439833-7cdd-42be-8f0e-058085fd5e08\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.268760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-utilities\") pod \"7e439833-7cdd-42be-8f0e-058085fd5e08\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.269043 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mtcz\" (UniqueName: \"kubernetes.io/projected/7e439833-7cdd-42be-8f0e-058085fd5e08-kube-api-access-2mtcz\") pod \"7e439833-7cdd-42be-8f0e-058085fd5e08\" (UID: \"7e439833-7cdd-42be-8f0e-058085fd5e08\") " Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.270135 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-utilities" (OuterVolumeSpecName: "utilities") pod "7e439833-7cdd-42be-8f0e-058085fd5e08" (UID: "7e439833-7cdd-42be-8f0e-058085fd5e08"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.278610 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e439833-7cdd-42be-8f0e-058085fd5e08-kube-api-access-2mtcz" (OuterVolumeSpecName: "kube-api-access-2mtcz") pod "7e439833-7cdd-42be-8f0e-058085fd5e08" (UID: "7e439833-7cdd-42be-8f0e-058085fd5e08"). InnerVolumeSpecName "kube-api-access-2mtcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.283459 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e439833-7cdd-42be-8f0e-058085fd5e08" (UID: "7e439833-7cdd-42be-8f0e-058085fd5e08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.372085 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mtcz\" (UniqueName: \"kubernetes.io/projected/7e439833-7cdd-42be-8f0e-058085fd5e08-kube-api-access-2mtcz\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.372465 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.372488 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e439833-7cdd-42be-8f0e-058085fd5e08-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.637865 4797 generic.go:334] "Generic (PLEG): container finished" podID="7e439833-7cdd-42be-8f0e-058085fd5e08" 
containerID="59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8" exitCode=0 Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.637920 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp292" event={"ID":"7e439833-7cdd-42be-8f0e-058085fd5e08","Type":"ContainerDied","Data":"59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8"} Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.637943 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp292" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.638006 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp292" event={"ID":"7e439833-7cdd-42be-8f0e-058085fd5e08","Type":"ContainerDied","Data":"140f76c998bd0743159cc9a13f9b533891925892b13f5964fdde55a4754c8140"} Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.638028 4797 scope.go:117] "RemoveContainer" containerID="59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.668854 4797 scope.go:117] "RemoveContainer" containerID="1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.673351 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp292"] Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.685635 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp292"] Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.703655 4797 scope.go:117] "RemoveContainer" containerID="96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.737401 4797 scope.go:117] "RemoveContainer" containerID="59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8" Oct 13 
14:58:30 crc kubenswrapper[4797]: E1013 14:58:30.738112 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8\": container with ID starting with 59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8 not found: ID does not exist" containerID="59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.738157 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8"} err="failed to get container status \"59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8\": rpc error: code = NotFound desc = could not find container \"59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8\": container with ID starting with 59d9e1b0e30675fc7dcac606a7c1f6636522365d5d40242b3c8e28ef4eff79e8 not found: ID does not exist" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.738185 4797 scope.go:117] "RemoveContainer" containerID="1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48" Oct 13 14:58:30 crc kubenswrapper[4797]: E1013 14:58:30.738480 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48\": container with ID starting with 1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48 not found: ID does not exist" containerID="1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.738512 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48"} err="failed to get container status 
\"1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48\": rpc error: code = NotFound desc = could not find container \"1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48\": container with ID starting with 1037a98a08e049c1d887130e65382092003e780b91366f36364e897ad3369c48 not found: ID does not exist" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.738529 4797 scope.go:117] "RemoveContainer" containerID="96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d" Oct 13 14:58:30 crc kubenswrapper[4797]: E1013 14:58:30.738818 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d\": container with ID starting with 96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d not found: ID does not exist" containerID="96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d" Oct 13 14:58:30 crc kubenswrapper[4797]: I1013 14:58:30.738942 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d"} err="failed to get container status \"96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d\": rpc error: code = NotFound desc = could not find container \"96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d\": container with ID starting with 96e8560a7a07df9fe73737bce857c7c56f148e338d3f9bca26ba2808a92a471d not found: ID does not exist" Oct 13 14:58:31 crc kubenswrapper[4797]: I1013 14:58:31.251783 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" path="/var/lib/kubelet/pods/7e439833-7cdd-42be-8f0e-058085fd5e08/volumes" Oct 13 14:58:43 crc kubenswrapper[4797]: I1013 14:58:43.768058 4797 generic.go:334] "Generic (PLEG): container finished" podID="e800517f-8039-464d-8137-cc928b84cc79" 
containerID="15242632ac45f1460411676950ef8f1044f44a5e302d7d11029f09416fa19d75" exitCode=0 Oct 13 14:58:43 crc kubenswrapper[4797]: I1013 14:58:43.768180 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" event={"ID":"e800517f-8039-464d-8137-cc928b84cc79","Type":"ContainerDied","Data":"15242632ac45f1460411676950ef8f1044f44a5e302d7d11029f09416fa19d75"} Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.261312 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.395571 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59lms\" (UniqueName: \"kubernetes.io/projected/e800517f-8039-464d-8137-cc928b84cc79-kube-api-access-59lms\") pod \"e800517f-8039-464d-8137-cc928b84cc79\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.396003 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ssh-key\") pod \"e800517f-8039-464d-8137-cc928b84cc79\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.396117 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-inventory\") pod \"e800517f-8039-464d-8137-cc928b84cc79\" (UID: \"e800517f-8039-464d-8137-cc928b84cc79\") " Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.396297 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ceph\") pod \"e800517f-8039-464d-8137-cc928b84cc79\" (UID: 
\"e800517f-8039-464d-8137-cc928b84cc79\") " Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.403953 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e800517f-8039-464d-8137-cc928b84cc79-kube-api-access-59lms" (OuterVolumeSpecName: "kube-api-access-59lms") pod "e800517f-8039-464d-8137-cc928b84cc79" (UID: "e800517f-8039-464d-8137-cc928b84cc79"). InnerVolumeSpecName "kube-api-access-59lms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.408145 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ceph" (OuterVolumeSpecName: "ceph") pod "e800517f-8039-464d-8137-cc928b84cc79" (UID: "e800517f-8039-464d-8137-cc928b84cc79"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.438926 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-inventory" (OuterVolumeSpecName: "inventory") pod "e800517f-8039-464d-8137-cc928b84cc79" (UID: "e800517f-8039-464d-8137-cc928b84cc79"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.448652 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e800517f-8039-464d-8137-cc928b84cc79" (UID: "e800517f-8039-464d-8137-cc928b84cc79"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.501406 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.501456 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59lms\" (UniqueName: \"kubernetes.io/projected/e800517f-8039-464d-8137-cc928b84cc79-kube-api-access-59lms\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.501469 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.501479 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e800517f-8039-464d-8137-cc928b84cc79-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.791990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" event={"ID":"e800517f-8039-464d-8137-cc928b84cc79","Type":"ContainerDied","Data":"425231eb7234f7dd3fb6d6369445da8307a5ace9ce72e8509fde15632ba6cc9c"} Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.792553 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="425231eb7234f7dd3fb6d6369445da8307a5ace9ce72e8509fde15632ba6cc9c" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.792095 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-6wfff" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.880657 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-5k77q"] Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881285 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="extract-utilities" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881310 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="extract-utilities" Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881381 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="extract-content" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881389 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="extract-content" Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881398 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e800517f-8039-464d-8137-cc928b84cc79" containerName="download-cache-openstack-openstack-cell1" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881405 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e800517f-8039-464d-8137-cc928b84cc79" containerName="download-cache-openstack-openstack-cell1" Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881415 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="registry-server" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881423 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="registry-server" Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881438 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="extract-content" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881444 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="extract-content" Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881459 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="registry-server" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881465 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="registry-server" Oct 13 14:58:45 crc kubenswrapper[4797]: E1013 14:58:45.881484 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="extract-utilities" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881490 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="extract-utilities" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881699 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e800517f-8039-464d-8137-cc928b84cc79" containerName="download-cache-openstack-openstack-cell1" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881713 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="06251d27-3d6c-424d-a24e-2f432999d518" containerName="registry-server" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.881726 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e439833-7cdd-42be-8f0e-058085fd5e08" containerName="registry-server" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.882742 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.887778 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.887997 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.888235 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.888271 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 14:58:45 crc kubenswrapper[4797]: I1013 14:58:45.897171 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-5k77q"] Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.015782 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ceph\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.016268 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-inventory\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.016488 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ssh-key\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.016580 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnn9j\" (UniqueName: \"kubernetes.io/projected/edbca830-a799-43d2-a312-1aa256aabed6-kube-api-access-qnn9j\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.118958 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnn9j\" (UniqueName: \"kubernetes.io/projected/edbca830-a799-43d2-a312-1aa256aabed6-kube-api-access-qnn9j\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.119045 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ceph\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.119240 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-inventory\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " 
pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.119312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ssh-key\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.126415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-inventory\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.126444 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ceph\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.126904 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ssh-key\") pod \"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.138974 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnn9j\" (UniqueName: \"kubernetes.io/projected/edbca830-a799-43d2-a312-1aa256aabed6-kube-api-access-qnn9j\") pod 
\"configure-network-openstack-openstack-cell1-5k77q\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.220258 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.792551 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-5k77q"] Oct 13 14:58:46 crc kubenswrapper[4797]: I1013 14:58:46.804441 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" event={"ID":"edbca830-a799-43d2-a312-1aa256aabed6","Type":"ContainerStarted","Data":"aabc189f646cce42eea885e6734846b76ed144ec985102b8b992360c3ee5bc17"} Oct 13 14:58:47 crc kubenswrapper[4797]: I1013 14:58:47.814912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" event={"ID":"edbca830-a799-43d2-a312-1aa256aabed6","Type":"ContainerStarted","Data":"10f0fd1da52b95350e8341a27dd180677003f4ef9c66ade914f5ce5233203f3a"} Oct 13 14:58:47 crc kubenswrapper[4797]: I1013 14:58:47.838052 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" podStartSLOduration=2.338299751 podStartE2EDuration="2.838028558s" podCreationTimestamp="2025-10-13 14:58:45 +0000 UTC" firstStartedPulling="2025-10-13 14:58:46.790217178 +0000 UTC m=+6704.323767434" lastFinishedPulling="2025-10-13 14:58:47.289945985 +0000 UTC m=+6704.823496241" observedRunningTime="2025-10-13 14:58:47.828495034 +0000 UTC m=+6705.362045310" watchObservedRunningTime="2025-10-13 14:58:47.838028558 +0000 UTC m=+6705.371578824" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.175944 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt"] Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.178326 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.187251 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.196832 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt"] Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.197035 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.290020 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-config-volume\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.290334 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-secret-volume\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.290681 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4nv8\" (UniqueName: 
\"kubernetes.io/projected/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-kube-api-access-p4nv8\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.393534 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4nv8\" (UniqueName: \"kubernetes.io/projected/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-kube-api-access-p4nv8\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.393955 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-config-volume\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.394056 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-secret-volume\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.395011 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-config-volume\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 
15:00:00.410297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-secret-volume\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.420393 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4nv8\" (UniqueName: \"kubernetes.io/projected/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-kube-api-access-p4nv8\") pod \"collect-profiles-29339460-5tzwt\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.501246 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:00 crc kubenswrapper[4797]: I1013 15:00:00.962666 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt"] Oct 13 15:00:01 crc kubenswrapper[4797]: I1013 15:00:01.568947 4797 generic.go:334] "Generic (PLEG): container finished" podID="d721eb5f-2ce0-4e90-b01c-ed1d97a58208" containerID="ec9d30e7973a3e688af8d06494c7157c62228f27e52664e7956e6d5e4e9a51ce" exitCode=0 Oct 13 15:00:01 crc kubenswrapper[4797]: I1013 15:00:01.569008 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" event={"ID":"d721eb5f-2ce0-4e90-b01c-ed1d97a58208","Type":"ContainerDied","Data":"ec9d30e7973a3e688af8d06494c7157c62228f27e52664e7956e6d5e4e9a51ce"} Oct 13 15:00:01 crc kubenswrapper[4797]: I1013 15:00:01.569222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" 
event={"ID":"d721eb5f-2ce0-4e90-b01c-ed1d97a58208","Type":"ContainerStarted","Data":"727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a"} Oct 13 15:00:02 crc kubenswrapper[4797]: I1013 15:00:02.964341 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.051725 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-secret-volume\") pod \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.051887 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-config-volume\") pod \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.051985 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4nv8\" (UniqueName: \"kubernetes.io/projected/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-kube-api-access-p4nv8\") pod \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\" (UID: \"d721eb5f-2ce0-4e90-b01c-ed1d97a58208\") " Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.052855 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-config-volume" (OuterVolumeSpecName: "config-volume") pod "d721eb5f-2ce0-4e90-b01c-ed1d97a58208" (UID: "d721eb5f-2ce0-4e90-b01c-ed1d97a58208"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.057389 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-kube-api-access-p4nv8" (OuterVolumeSpecName: "kube-api-access-p4nv8") pod "d721eb5f-2ce0-4e90-b01c-ed1d97a58208" (UID: "d721eb5f-2ce0-4e90-b01c-ed1d97a58208"). InnerVolumeSpecName "kube-api-access-p4nv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.065020 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d721eb5f-2ce0-4e90-b01c-ed1d97a58208" (UID: "d721eb5f-2ce0-4e90-b01c-ed1d97a58208"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.154871 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.154933 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.154950 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4nv8\" (UniqueName: \"kubernetes.io/projected/d721eb5f-2ce0-4e90-b01c-ed1d97a58208-kube-api-access-p4nv8\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.588907 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" 
event={"ID":"d721eb5f-2ce0-4e90-b01c-ed1d97a58208","Type":"ContainerDied","Data":"727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a"} Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.588946 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a" Oct 13 15:00:03 crc kubenswrapper[4797]: I1013 15:00:03.588979 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt" Oct 13 15:00:04 crc kubenswrapper[4797]: I1013 15:00:04.064838 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq"] Oct 13 15:00:04 crc kubenswrapper[4797]: I1013 15:00:04.072883 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339415-4wslq"] Oct 13 15:00:05 crc kubenswrapper[4797]: I1013 15:00:05.249044 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85aee1f1-31f9-4f8e-ad76-802c222bbece" path="/var/lib/kubelet/pods/85aee1f1-31f9-4f8e-ad76-802c222bbece/volumes" Oct 13 15:00:05 crc kubenswrapper[4797]: I1013 15:00:05.608436 4797 generic.go:334] "Generic (PLEG): container finished" podID="edbca830-a799-43d2-a312-1aa256aabed6" containerID="10f0fd1da52b95350e8341a27dd180677003f4ef9c66ade914f5ce5233203f3a" exitCode=0 Oct 13 15:00:05 crc kubenswrapper[4797]: I1013 15:00:05.608513 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" event={"ID":"edbca830-a799-43d2-a312-1aa256aabed6","Type":"ContainerDied","Data":"10f0fd1da52b95350e8341a27dd180677003f4ef9c66ade914f5ce5233203f3a"} Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.072315 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.141578 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ssh-key\") pod \"edbca830-a799-43d2-a312-1aa256aabed6\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.141729 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-inventory\") pod \"edbca830-a799-43d2-a312-1aa256aabed6\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.141832 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ceph\") pod \"edbca830-a799-43d2-a312-1aa256aabed6\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.141880 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnn9j\" (UniqueName: \"kubernetes.io/projected/edbca830-a799-43d2-a312-1aa256aabed6-kube-api-access-qnn9j\") pod \"edbca830-a799-43d2-a312-1aa256aabed6\" (UID: \"edbca830-a799-43d2-a312-1aa256aabed6\") " Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.148652 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbca830-a799-43d2-a312-1aa256aabed6-kube-api-access-qnn9j" (OuterVolumeSpecName: "kube-api-access-qnn9j") pod "edbca830-a799-43d2-a312-1aa256aabed6" (UID: "edbca830-a799-43d2-a312-1aa256aabed6"). InnerVolumeSpecName "kube-api-access-qnn9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.149160 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ceph" (OuterVolumeSpecName: "ceph") pod "edbca830-a799-43d2-a312-1aa256aabed6" (UID: "edbca830-a799-43d2-a312-1aa256aabed6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.173007 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-inventory" (OuterVolumeSpecName: "inventory") pod "edbca830-a799-43d2-a312-1aa256aabed6" (UID: "edbca830-a799-43d2-a312-1aa256aabed6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.179304 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "edbca830-a799-43d2-a312-1aa256aabed6" (UID: "edbca830-a799-43d2-a312-1aa256aabed6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.244210 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.244249 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.244261 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/edbca830-a799-43d2-a312-1aa256aabed6-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.244298 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnn9j\" (UniqueName: \"kubernetes.io/projected/edbca830-a799-43d2-a312-1aa256aabed6-kube-api-access-qnn9j\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.628387 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" event={"ID":"edbca830-a799-43d2-a312-1aa256aabed6","Type":"ContainerDied","Data":"aabc189f646cce42eea885e6734846b76ed144ec985102b8b992360c3ee5bc17"} Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.628434 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aabc189f646cce42eea885e6734846b76ed144ec985102b8b992360c3ee5bc17" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.628515 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-5k77q" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.725470 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-l9gmr"] Oct 13 15:00:07 crc kubenswrapper[4797]: E1013 15:00:07.725970 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d721eb5f-2ce0-4e90-b01c-ed1d97a58208" containerName="collect-profiles" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.725996 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d721eb5f-2ce0-4e90-b01c-ed1d97a58208" containerName="collect-profiles" Oct 13 15:00:07 crc kubenswrapper[4797]: E1013 15:00:07.726024 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbca830-a799-43d2-a312-1aa256aabed6" containerName="configure-network-openstack-openstack-cell1" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.726034 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbca830-a799-43d2-a312-1aa256aabed6" containerName="configure-network-openstack-openstack-cell1" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.726346 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d721eb5f-2ce0-4e90-b01c-ed1d97a58208" containerName="collect-profiles" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.726370 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbca830-a799-43d2-a312-1aa256aabed6" containerName="configure-network-openstack-openstack-cell1" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.727339 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.730709 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.730878 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.731920 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.737596 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.739187 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-l9gmr"] Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.864203 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ceph\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.864256 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ssh-key\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.864305 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fsbcb\" (UniqueName: \"kubernetes.io/projected/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-kube-api-access-fsbcb\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.864342 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-inventory\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.966414 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ceph\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.966785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ssh-key\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.966887 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbcb\" (UniqueName: \"kubernetes.io/projected/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-kube-api-access-fsbcb\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 
13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.966947 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-inventory\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.971657 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-inventory\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.971861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ceph\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.971902 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ssh-key\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:07 crc kubenswrapper[4797]: I1013 15:00:07.987441 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbcb\" (UniqueName: \"kubernetes.io/projected/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-kube-api-access-fsbcb\") pod \"validate-network-openstack-openstack-cell1-l9gmr\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " 
pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:08 crc kubenswrapper[4797]: I1013 15:00:08.048490 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:08 crc kubenswrapper[4797]: I1013 15:00:08.655438 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-l9gmr"] Oct 13 15:00:08 crc kubenswrapper[4797]: I1013 15:00:08.663275 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:00:09 crc kubenswrapper[4797]: E1013 15:00:09.115762 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice/crio-727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice\": RecentStats: unable to find data in memory cache]" Oct 13 15:00:09 crc kubenswrapper[4797]: I1013 15:00:09.658169 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" event={"ID":"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce","Type":"ContainerStarted","Data":"d633f0644eeb953592681d58f9d8a4a14380efd82f35db9da2b9f31b8f11a099"} Oct 13 15:00:10 crc kubenswrapper[4797]: I1013 15:00:10.675729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" event={"ID":"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce","Type":"ContainerStarted","Data":"d287bf73908b1c447498d4c21f630bcc9c4dfd6fea0029e4a6719187bae8cf7a"} Oct 13 15:00:10 crc kubenswrapper[4797]: I1013 15:00:10.696012 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" podStartSLOduration=2.996645822 podStartE2EDuration="3.695968074s" podCreationTimestamp="2025-10-13 15:00:07 +0000 UTC" firstStartedPulling="2025-10-13 15:00:08.662884839 +0000 UTC m=+6786.196435115" lastFinishedPulling="2025-10-13 15:00:09.362207111 +0000 UTC m=+6786.895757367" observedRunningTime="2025-10-13 15:00:10.692370436 +0000 UTC m=+6788.225920732" watchObservedRunningTime="2025-10-13 15:00:10.695968074 +0000 UTC m=+6788.229518340" Oct 13 15:00:14 crc kubenswrapper[4797]: I1013 15:00:14.739998 4797 generic.go:334] "Generic (PLEG): container finished" podID="94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" containerID="d287bf73908b1c447498d4c21f630bcc9c4dfd6fea0029e4a6719187bae8cf7a" exitCode=0 Oct 13 15:00:14 crc kubenswrapper[4797]: I1013 15:00:14.740126 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" event={"ID":"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce","Type":"ContainerDied","Data":"d287bf73908b1c447498d4c21f630bcc9c4dfd6fea0029e4a6719187bae8cf7a"} Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.216321 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.345007 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ssh-key\") pod \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.345234 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsbcb\" (UniqueName: \"kubernetes.io/projected/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-kube-api-access-fsbcb\") pod \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.345300 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-inventory\") pod \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.345420 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ceph\") pod \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\" (UID: \"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce\") " Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.361038 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-kube-api-access-fsbcb" (OuterVolumeSpecName: "kube-api-access-fsbcb") pod "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" (UID: "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce"). InnerVolumeSpecName "kube-api-access-fsbcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.361091 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ceph" (OuterVolumeSpecName: "ceph") pod "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" (UID: "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.376435 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" (UID: "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.382883 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-inventory" (OuterVolumeSpecName: "inventory") pod "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" (UID: "94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.448763 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.448799 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.448827 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsbcb\" (UniqueName: \"kubernetes.io/projected/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-kube-api-access-fsbcb\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.448837 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.763407 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" event={"ID":"94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce","Type":"ContainerDied","Data":"d633f0644eeb953592681d58f9d8a4a14380efd82f35db9da2b9f31b8f11a099"} Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.763459 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d633f0644eeb953592681d58f9d8a4a14380efd82f35db9da2b9f31b8f11a099" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.763744 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-l9gmr" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.834097 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-r6xm4"] Oct 13 15:00:16 crc kubenswrapper[4797]: E1013 15:00:16.835027 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" containerName="validate-network-openstack-openstack-cell1" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.835052 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" containerName="validate-network-openstack-openstack-cell1" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.835310 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce" containerName="validate-network-openstack-openstack-cell1" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.836307 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.838794 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.838992 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.839083 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.839110 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.843083 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-r6xm4"] Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.958617 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ceph\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.958724 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm4s7\" (UniqueName: \"kubernetes.io/projected/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-kube-api-access-nm4s7\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.958792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-inventory\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:16 crc kubenswrapper[4797]: I1013 15:00:16.958917 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ssh-key\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.060439 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ssh-key\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.060511 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ceph\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.060582 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm4s7\" (UniqueName: \"kubernetes.io/projected/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-kube-api-access-nm4s7\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4" Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.060647 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-inventory\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.066345 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ceph\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.066635 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-inventory\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.067150 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ssh-key\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.077589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm4s7\" (UniqueName: \"kubernetes.io/projected/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-kube-api-access-nm4s7\") pod \"install-os-openstack-openstack-cell1-r6xm4\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") " pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.155324 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.669378 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-r6xm4"]
Oct 13 15:00:17 crc kubenswrapper[4797]: W1013 15:00:17.675563 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb2cf8ea_12be_4bf7_9b6e_fcded0175c91.slice/crio-a01dc5bbd450cc177b540ade9adafb651feaef9b62f5ad4c5fe55fbf9da2b735 WatchSource:0}: Error finding container a01dc5bbd450cc177b540ade9adafb651feaef9b62f5ad4c5fe55fbf9da2b735: Status 404 returned error can't find the container with id a01dc5bbd450cc177b540ade9adafb651feaef9b62f5ad4c5fe55fbf9da2b735
Oct 13 15:00:17 crc kubenswrapper[4797]: I1013 15:00:17.774233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r6xm4" event={"ID":"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91","Type":"ContainerStarted","Data":"a01dc5bbd450cc177b540ade9adafb651feaef9b62f5ad4c5fe55fbf9da2b735"}
Oct 13 15:00:18 crc kubenswrapper[4797]: I1013 15:00:18.803888 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r6xm4" event={"ID":"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91","Type":"ContainerStarted","Data":"2d438d867dbc7f9a4ed08b3158f8bff709612fb477a178e16ac0afba4abd47a6"}
Oct 13 15:00:18 crc kubenswrapper[4797]: I1013 15:00:18.825932 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-r6xm4" podStartSLOduration=2.410098248 podStartE2EDuration="2.825905755s" podCreationTimestamp="2025-10-13 15:00:16 +0000 UTC" firstStartedPulling="2025-10-13 15:00:17.678892201 +0000 UTC m=+6795.212442457" lastFinishedPulling="2025-10-13 15:00:18.094699718 +0000 UTC m=+6795.628249964" observedRunningTime="2025-10-13 15:00:18.821983588 +0000 UTC m=+6796.355533854" watchObservedRunningTime="2025-10-13 15:00:18.825905755 +0000 UTC m=+6796.359456021"
Oct 13 15:00:19 crc kubenswrapper[4797]: I1013 15:00:19.205798 4797 scope.go:117] "RemoveContainer" containerID="681eee713f8e63de016f4eb9fc053d1417d329ce5d6284aee304a3a642fb7342"
Oct 13 15:00:19 crc kubenswrapper[4797]: E1013 15:00:19.426475 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice/crio-727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice\": RecentStats: unable to find data in memory cache]"
Oct 13 15:00:29 crc kubenswrapper[4797]: E1013 15:00:29.688828 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice/crio-727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice\": RecentStats: unable to find data in memory cache]"
Oct 13 15:00:39 crc kubenswrapper[4797]: E1013 15:00:39.994353 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice/crio-727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice\": RecentStats: unable to find data in memory cache]"
Oct 13 15:00:48 crc kubenswrapper[4797]: I1013 15:00:48.120135 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 15:00:48 crc kubenswrapper[4797]: I1013 15:00:48.122111 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 15:00:50 crc kubenswrapper[4797]: E1013 15:00:50.273433 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice/crio-727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice\": RecentStats: unable to find data in memory cache]"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.155826 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29339461-hwsdn"]
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.158898 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.168783 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339461-hwsdn"]
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.344750 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-fernet-keys\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.344924 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-config-data\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.345019 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-combined-ca-bundle\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.345256 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7m7\" (UniqueName: \"kubernetes.io/projected/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-kube-api-access-sd7m7\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.462221 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7m7\" (UniqueName: \"kubernetes.io/projected/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-kube-api-access-sd7m7\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.462712 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-fernet-keys\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.462977 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-config-data\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.463240 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-combined-ca-bundle\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.469513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-combined-ca-bundle\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.476407 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-config-data\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.492246 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7m7\" (UniqueName: \"kubernetes.io/projected/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-kube-api-access-sd7m7\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.495610 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-fernet-keys\") pod \"keystone-cron-29339461-hwsdn\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") " pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: I1013 15:01:00.519964 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:00 crc kubenswrapper[4797]: E1013 15:01:00.639691 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice/crio-727b6da29b86a6a4b4f78a0029f0aa823cf2a3701195adaa2300268928a4c62a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd721eb5f_2ce0_4e90_b01c_ed1d97a58208.slice\": RecentStats: unable to find data in memory cache]"
Oct 13 15:01:01 crc kubenswrapper[4797]: I1013 15:01:01.004536 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339461-hwsdn"]
Oct 13 15:01:01 crc kubenswrapper[4797]: I1013 15:01:01.250429 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339461-hwsdn" event={"ID":"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2","Type":"ContainerStarted","Data":"7af52699c42b67419d87703603b6126e118ba1ab1e5f9b1f7679f5c1d69ff3b7"}
Oct 13 15:01:01 crc kubenswrapper[4797]: I1013 15:01:01.270054 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29339461-hwsdn" podStartSLOduration=1.270034745 podStartE2EDuration="1.270034745s" podCreationTimestamp="2025-10-13 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 15:01:01.266263962 +0000 UTC m=+6838.799814218" watchObservedRunningTime="2025-10-13 15:01:01.270034745 +0000 UTC m=+6838.803585001"
Oct 13 15:01:02 crc kubenswrapper[4797]: I1013 15:01:02.254721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339461-hwsdn" event={"ID":"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2","Type":"ContainerStarted","Data":"18a1bfd923b58cc8986b233e40565d5bbbd1693f7bd312ab7648d886d3e5aa95"}
Oct 13 15:01:05 crc kubenswrapper[4797]: I1013 15:01:05.283708 4797 generic.go:334] "Generic (PLEG): container finished" podID="7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" containerID="18a1bfd923b58cc8986b233e40565d5bbbd1693f7bd312ab7648d886d3e5aa95" exitCode=0
Oct 13 15:01:05 crc kubenswrapper[4797]: I1013 15:01:05.286360 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339461-hwsdn" event={"ID":"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2","Type":"ContainerDied","Data":"18a1bfd923b58cc8986b233e40565d5bbbd1693f7bd312ab7648d886d3e5aa95"}
Oct 13 15:01:05 crc kubenswrapper[4797]: I1013 15:01:05.288875 4797 generic.go:334] "Generic (PLEG): container finished" podID="fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" containerID="2d438d867dbc7f9a4ed08b3158f8bff709612fb477a178e16ac0afba4abd47a6" exitCode=0
Oct 13 15:01:05 crc kubenswrapper[4797]: I1013 15:01:05.289016 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r6xm4" event={"ID":"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91","Type":"ContainerDied","Data":"2d438d867dbc7f9a4ed08b3158f8bff709612fb477a178e16ac0afba4abd47a6"}
Oct 13 15:01:06 crc kubenswrapper[4797]: I1013 15:01:06.882740 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:06 crc kubenswrapper[4797]: I1013 15:01:06.899588 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028230 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-config-data\") pod \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028325 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-inventory\") pod \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028405 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ssh-key\") pod \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028436 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ceph\") pod \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028513 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-fernet-keys\") pod \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028639 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-combined-ca-bundle\") pod \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028676 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd7m7\" (UniqueName: \"kubernetes.io/projected/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-kube-api-access-sd7m7\") pod \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\" (UID: \"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.028705 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm4s7\" (UniqueName: \"kubernetes.io/projected/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-kube-api-access-nm4s7\") pod \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\" (UID: \"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91\") "
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.036367 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" (UID: "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.036388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ceph" (OuterVolumeSpecName: "ceph") pod "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" (UID: "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.038110 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-kube-api-access-nm4s7" (OuterVolumeSpecName: "kube-api-access-nm4s7") pod "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" (UID: "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91"). InnerVolumeSpecName "kube-api-access-nm4s7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.042008 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-kube-api-access-sd7m7" (OuterVolumeSpecName: "kube-api-access-sd7m7") pod "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" (UID: "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2"). InnerVolumeSpecName "kube-api-access-sd7m7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.069852 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" (UID: "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.077681 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" (UID: "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.080774 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-inventory" (OuterVolumeSpecName: "inventory") pod "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" (UID: "fb2cf8ea-12be-4bf7-9b6e-fcded0175c91"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.105993 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-config-data" (OuterVolumeSpecName: "config-data") pod "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" (UID: "7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132067 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132158 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-inventory\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132173 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132185 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-ceph\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132194 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132224 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132237 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd7m7\" (UniqueName: \"kubernetes.io/projected/7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2-kube-api-access-sd7m7\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.132249 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm4s7\" (UniqueName: \"kubernetes.io/projected/fb2cf8ea-12be-4bf7-9b6e-fcded0175c91-kube-api-access-nm4s7\") on node \"crc\" DevicePath \"\""
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.312166 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-r6xm4" event={"ID":"fb2cf8ea-12be-4bf7-9b6e-fcded0175c91","Type":"ContainerDied","Data":"a01dc5bbd450cc177b540ade9adafb651feaef9b62f5ad4c5fe55fbf9da2b735"}
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.312204 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a01dc5bbd450cc177b540ade9adafb651feaef9b62f5ad4c5fe55fbf9da2b735"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.312474 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-r6xm4"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.314262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339461-hwsdn" event={"ID":"7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2","Type":"ContainerDied","Data":"7af52699c42b67419d87703603b6126e118ba1ab1e5f9b1f7679f5c1d69ff3b7"}
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.314285 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7af52699c42b67419d87703603b6126e118ba1ab1e5f9b1f7679f5c1d69ff3b7"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.314329 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339461-hwsdn"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.420049 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-fx9pj"]
Oct 13 15:01:07 crc kubenswrapper[4797]: E1013 15:01:07.420510 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" containerName="install-os-openstack-openstack-cell1"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.420528 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" containerName="install-os-openstack-openstack-cell1"
Oct 13 15:01:07 crc kubenswrapper[4797]: E1013 15:01:07.420559 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" containerName="keystone-cron"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.420567 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" containerName="keystone-cron"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.420772 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2cf8ea-12be-4bf7-9b6e-fcded0175c91" containerName="install-os-openstack-openstack-cell1"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.420883 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2" containerName="keystone-cron"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.421592 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.425457 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.425704 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.425880 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.426006 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.427995 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-fx9pj"]
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.540744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5n5\" (UniqueName: \"kubernetes.io/projected/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-kube-api-access-pg5n5\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.541008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ceph\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.541044 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-inventory\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.541082 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ssh-key\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.643440 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5n5\" (UniqueName: \"kubernetes.io/projected/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-kube-api-access-pg5n5\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.643633 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ceph\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.643682 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-inventory\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.643730 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ssh-key\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.650163 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ssh-key\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.650205 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ceph\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.650986 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-inventory\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.661539 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5n5\" (UniqueName: \"kubernetes.io/projected/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-kube-api-access-pg5n5\") pod \"configure-os-openstack-openstack-cell1-fx9pj\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:07 crc kubenswrapper[4797]: I1013 15:01:07.741205 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj"
Oct 13 15:01:08 crc kubenswrapper[4797]: I1013 15:01:08.484578 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-fx9pj"]
Oct 13 15:01:08 crc kubenswrapper[4797]: W1013 15:01:08.501187 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc3cd5ee_3e78_4d2f_97dd_6901058f0f49.slice/crio-f520f226542dc4892a90d020cbefa7ef5a15eac35bdd669e88fb981c001be95e WatchSource:0}: Error finding container f520f226542dc4892a90d020cbefa7ef5a15eac35bdd669e88fb981c001be95e: Status 404 returned error can't find the container with id f520f226542dc4892a90d020cbefa7ef5a15eac35bdd669e88fb981c001be95e
Oct 13 15:01:09 crc kubenswrapper[4797]: I1013 15:01:09.339131 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" event={"ID":"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49","Type":"ContainerStarted","Data":"9ac27d44dcb0139e5d44bf71f60fd2a02022eb5f4b4ce27aa43c2deb4d634973"}
Oct 13 15:01:09 crc kubenswrapper[4797]: I1013 15:01:09.339617 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" event={"ID":"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49","Type":"ContainerStarted","Data":"f520f226542dc4892a90d020cbefa7ef5a15eac35bdd669e88fb981c001be95e"}
Oct 13 15:01:09 crc kubenswrapper[4797]: I1013 15:01:09.367484 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" podStartSLOduration=1.8997648169999999 podStartE2EDuration="2.36746032s" podCreationTimestamp="2025-10-13 15:01:07 +0000 UTC" firstStartedPulling="2025-10-13 15:01:08.50605117 +0000 UTC m=+6846.039601436" lastFinishedPulling="2025-10-13 15:01:08.973746683 +0000 UTC m=+6846.507296939" observedRunningTime="2025-10-13 15:01:09.357937215 +0000 UTC m=+6846.891487501" watchObservedRunningTime="2025-10-13 15:01:09.36746032 +0000 UTC m=+6846.901010586"
Oct 13 15:01:18 crc kubenswrapper[4797]: I1013 15:01:18.120403 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 15:01:18 crc kubenswrapper[4797]: I1013 15:01:18.121005 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.140301 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h2pk2"]
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.149018 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2pk2"
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.150173 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2pk2"]
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.153500 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-utilities\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2"
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.153614 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-catalog-content\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2"
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.153716 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5tg\" (UniqueName: \"kubernetes.io/projected/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-kube-api-access-2p5tg\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2"
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.256412 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-catalog-content\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2"
Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.257139 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-catalog-content\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.257548 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5tg\" (UniqueName: \"kubernetes.io/projected/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-kube-api-access-2p5tg\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.257861 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-utilities\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.258242 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-utilities\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.282092 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5tg\" (UniqueName: \"kubernetes.io/projected/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-kube-api-access-2p5tg\") pod \"certified-operators-h2pk2\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:36 crc kubenswrapper[4797]: I1013 15:01:36.479568 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:37 crc kubenswrapper[4797]: I1013 15:01:37.042170 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2pk2"] Oct 13 15:01:37 crc kubenswrapper[4797]: I1013 15:01:37.640372 4797 generic.go:334] "Generic (PLEG): container finished" podID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerID="fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1" exitCode=0 Oct 13 15:01:37 crc kubenswrapper[4797]: I1013 15:01:37.640630 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2pk2" event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerDied","Data":"fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1"} Oct 13 15:01:37 crc kubenswrapper[4797]: I1013 15:01:37.640910 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2pk2" event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerStarted","Data":"32bd984e30243ad4957a3c6ead331a2891e8d1d3e8918da51d4e6e33bf4b9c73"} Oct 13 15:01:38 crc kubenswrapper[4797]: I1013 15:01:38.652152 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2pk2" event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerStarted","Data":"a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087"} Oct 13 15:01:39 crc kubenswrapper[4797]: I1013 15:01:39.664175 4797 generic.go:334] "Generic (PLEG): container finished" podID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerID="a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087" exitCode=0 Oct 13 15:01:39 crc kubenswrapper[4797]: I1013 15:01:39.664222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2pk2" 
event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerDied","Data":"a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087"} Oct 13 15:01:40 crc kubenswrapper[4797]: I1013 15:01:40.678038 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2pk2" event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerStarted","Data":"ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b"} Oct 13 15:01:40 crc kubenswrapper[4797]: I1013 15:01:40.711139 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h2pk2" podStartSLOduration=2.29445174 podStartE2EDuration="4.711114888s" podCreationTimestamp="2025-10-13 15:01:36 +0000 UTC" firstStartedPulling="2025-10-13 15:01:37.643338147 +0000 UTC m=+6875.176888403" lastFinishedPulling="2025-10-13 15:01:40.060001295 +0000 UTC m=+6877.593551551" observedRunningTime="2025-10-13 15:01:40.699924142 +0000 UTC m=+6878.233474468" watchObservedRunningTime="2025-10-13 15:01:40.711114888 +0000 UTC m=+6878.244665184" Oct 13 15:01:46 crc kubenswrapper[4797]: I1013 15:01:46.480128 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:46 crc kubenswrapper[4797]: I1013 15:01:46.480723 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:46 crc kubenswrapper[4797]: I1013 15:01:46.532697 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:46 crc kubenswrapper[4797]: I1013 15:01:46.814413 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:47 crc kubenswrapper[4797]: I1013 15:01:47.533038 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-h2pk2"] Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.119953 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.120019 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.120066 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.120937 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ced763cf8f63ef478d23b8f41f116ec9c1aafb73fa9427083e41fbea89d39fa"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.120998 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://9ced763cf8f63ef478d23b8f41f116ec9c1aafb73fa9427083e41fbea89d39fa" gracePeriod=600 Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.771164 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="9ced763cf8f63ef478d23b8f41f116ec9c1aafb73fa9427083e41fbea89d39fa" exitCode=0 Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.771232 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"9ced763cf8f63ef478d23b8f41f116ec9c1aafb73fa9427083e41fbea89d39fa"} Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.771818 4797 scope.go:117] "RemoveContainer" containerID="1e92f33646910139b178cb12e4fc4664f24f2ce749b3ae38aac00d92a8bb562c" Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.771831 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h2pk2" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="registry-server" containerID="cri-o://ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b" gracePeriod=2 Oct 13 15:01:48 crc kubenswrapper[4797]: I1013 15:01:48.772140 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d"} Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.269410 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.371737 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p5tg\" (UniqueName: \"kubernetes.io/projected/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-kube-api-access-2p5tg\") pod \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.371786 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-catalog-content\") pod \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.371914 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-utilities\") pod \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\" (UID: \"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8\") " Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.376510 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-utilities" (OuterVolumeSpecName: "utilities") pod "5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" (UID: "5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.391949 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-kube-api-access-2p5tg" (OuterVolumeSpecName: "kube-api-access-2p5tg") pod "5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" (UID: "5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8"). InnerVolumeSpecName "kube-api-access-2p5tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.428610 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" (UID: "5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.475344 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p5tg\" (UniqueName: \"kubernetes.io/projected/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-kube-api-access-2p5tg\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.475386 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.475399 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.783343 4797 generic.go:334] "Generic (PLEG): container finished" podID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerID="ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b" exitCode=0 Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.783395 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2pk2" event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerDied","Data":"ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b"} Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.783461 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-h2pk2" event={"ID":"5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8","Type":"ContainerDied","Data":"32bd984e30243ad4957a3c6ead331a2891e8d1d3e8918da51d4e6e33bf4b9c73"} Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.783484 4797 scope.go:117] "RemoveContainer" containerID="ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.784676 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2pk2" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.809491 4797 scope.go:117] "RemoveContainer" containerID="a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.825667 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2pk2"] Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.834711 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h2pk2"] Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.838945 4797 scope.go:117] "RemoveContainer" containerID="fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.879842 4797 scope.go:117] "RemoveContainer" containerID="ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b" Oct 13 15:01:49 crc kubenswrapper[4797]: E1013 15:01:49.880602 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b\": container with ID starting with ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b not found: ID does not exist" containerID="ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 
15:01:49.880666 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b"} err="failed to get container status \"ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b\": rpc error: code = NotFound desc = could not find container \"ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b\": container with ID starting with ddfe4c9513e6d7dfe634f402118dda00cf344065437b71aa54ac69d999b7153b not found: ID does not exist" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.880749 4797 scope.go:117] "RemoveContainer" containerID="a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087" Oct 13 15:01:49 crc kubenswrapper[4797]: E1013 15:01:49.881039 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087\": container with ID starting with a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087 not found: ID does not exist" containerID="a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.881071 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087"} err="failed to get container status \"a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087\": rpc error: code = NotFound desc = could not find container \"a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087\": container with ID starting with a3159d3c1bf45fc4c7e163b7ad9581de58df9ae7eadcbc8b07cb9d38269bf087 not found: ID does not exist" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.881103 4797 scope.go:117] "RemoveContainer" containerID="fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1" Oct 13 15:01:49 crc 
kubenswrapper[4797]: E1013 15:01:49.881448 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1\": container with ID starting with fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1 not found: ID does not exist" containerID="fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1" Oct 13 15:01:49 crc kubenswrapper[4797]: I1013 15:01:49.881481 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1"} err="failed to get container status \"fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1\": rpc error: code = NotFound desc = could not find container \"fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1\": container with ID starting with fcabd8627cfdf2d26981fbf9cc7e0f1dcf0fef9fdf300c5dd3b063947e6006c1 not found: ID does not exist" Oct 13 15:01:51 crc kubenswrapper[4797]: I1013 15:01:51.248383 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" path="/var/lib/kubelet/pods/5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8/volumes" Oct 13 15:01:53 crc kubenswrapper[4797]: I1013 15:01:53.834013 4797 generic.go:334] "Generic (PLEG): container finished" podID="dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" containerID="9ac27d44dcb0139e5d44bf71f60fd2a02022eb5f4b4ce27aa43c2deb4d634973" exitCode=0 Oct 13 15:01:53 crc kubenswrapper[4797]: I1013 15:01:53.834134 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" event={"ID":"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49","Type":"ContainerDied","Data":"9ac27d44dcb0139e5d44bf71f60fd2a02022eb5f4b4ce27aa43c2deb4d634973"} Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.263975 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.421174 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg5n5\" (UniqueName: \"kubernetes.io/projected/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-kube-api-access-pg5n5\") pod \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.421351 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-inventory\") pod \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.421403 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ssh-key\") pod \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.421499 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ceph\") pod \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\" (UID: \"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49\") " Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.427116 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ceph" (OuterVolumeSpecName: "ceph") pod "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" (UID: "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.427109 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-kube-api-access-pg5n5" (OuterVolumeSpecName: "kube-api-access-pg5n5") pod "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" (UID: "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49"). InnerVolumeSpecName "kube-api-access-pg5n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.453077 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" (UID: "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.469916 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-inventory" (OuterVolumeSpecName: "inventory") pod "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" (UID: "dc3cd5ee-3e78-4d2f-97dd-6901058f0f49"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.525342 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.525435 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.525455 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.525473 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg5n5\" (UniqueName: \"kubernetes.io/projected/dc3cd5ee-3e78-4d2f-97dd-6901058f0f49-kube-api-access-pg5n5\") on node \"crc\" DevicePath \"\"" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.857576 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" event={"ID":"dc3cd5ee-3e78-4d2f-97dd-6901058f0f49","Type":"ContainerDied","Data":"f520f226542dc4892a90d020cbefa7ef5a15eac35bdd669e88fb981c001be95e"} Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.857641 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f520f226542dc4892a90d020cbefa7ef5a15eac35bdd669e88fb981c001be95e" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.857684 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-fx9pj" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.963136 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-6c9b5"] Oct 13 15:01:55 crc kubenswrapper[4797]: E1013 15:01:55.963764 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="extract-content" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.964436 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="extract-content" Oct 13 15:01:55 crc kubenswrapper[4797]: E1013 15:01:55.964507 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="registry-server" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.964562 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="registry-server" Oct 13 15:01:55 crc kubenswrapper[4797]: E1013 15:01:55.964636 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" containerName="configure-os-openstack-openstack-cell1" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.964688 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" containerName="configure-os-openstack-openstack-cell1" Oct 13 15:01:55 crc kubenswrapper[4797]: E1013 15:01:55.964762 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="extract-utilities" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.964841 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="extract-utilities" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.965143 4797 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5bfcfff8-ee9a-44ea-9106-f64ac3a0ade8" containerName="registry-server" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.965238 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3cd5ee-3e78-4d2f-97dd-6901058f0f49" containerName="configure-os-openstack-openstack-cell1" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.966125 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.969486 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.969528 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.969788 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.973314 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:01:55 crc kubenswrapper[4797]: I1013 15:01:55.990491 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-6c9b5"] Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.037964 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-inventory-0\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.038251 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lsk\" (UniqueName: 
\"kubernetes.io/projected/0e92099f-7960-419f-adcc-d73622e1b2f1-kube-api-access-v4lsk\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.038349 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.038522 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ceph\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.140769 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ceph\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.140894 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-inventory-0\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.140943 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lsk\" (UniqueName: 
\"kubernetes.io/projected/0e92099f-7960-419f-adcc-d73622e1b2f1-kube-api-access-v4lsk\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.140993 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.145476 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-inventory-0\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.147622 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ceph\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.147971 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.161569 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4lsk\" (UniqueName: 
\"kubernetes.io/projected/0e92099f-7960-419f-adcc-d73622e1b2f1-kube-api-access-v4lsk\") pod \"ssh-known-hosts-openstack-6c9b5\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.288422 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.829551 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-6c9b5"] Oct 13 15:01:56 crc kubenswrapper[4797]: I1013 15:01:56.872606 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-6c9b5" event={"ID":"0e92099f-7960-419f-adcc-d73622e1b2f1","Type":"ContainerStarted","Data":"da73d0a4f538042263a179090e7b6a273fb377924141fc552eaaacce5486f2ac"} Oct 13 15:01:57 crc kubenswrapper[4797]: I1013 15:01:57.889619 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-6c9b5" event={"ID":"0e92099f-7960-419f-adcc-d73622e1b2f1","Type":"ContainerStarted","Data":"59520bfe6dbdab68317216ed11291e8e2a7c8ae392fec8665d77ec9c8f856e69"} Oct 13 15:02:06 crc kubenswrapper[4797]: I1013 15:02:06.003853 4797 generic.go:334] "Generic (PLEG): container finished" podID="0e92099f-7960-419f-adcc-d73622e1b2f1" containerID="59520bfe6dbdab68317216ed11291e8e2a7c8ae392fec8665d77ec9c8f856e69" exitCode=0 Oct 13 15:02:06 crc kubenswrapper[4797]: I1013 15:02:06.003952 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-6c9b5" event={"ID":"0e92099f-7960-419f-adcc-d73622e1b2f1","Type":"ContainerDied","Data":"59520bfe6dbdab68317216ed11291e8e2a7c8ae392fec8665d77ec9c8f856e69"} Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.456948 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.592451 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-inventory-0\") pod \"0e92099f-7960-419f-adcc-d73622e1b2f1\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.592603 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ssh-key-openstack-cell1\") pod \"0e92099f-7960-419f-adcc-d73622e1b2f1\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.592698 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ceph\") pod \"0e92099f-7960-419f-adcc-d73622e1b2f1\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.592738 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4lsk\" (UniqueName: \"kubernetes.io/projected/0e92099f-7960-419f-adcc-d73622e1b2f1-kube-api-access-v4lsk\") pod \"0e92099f-7960-419f-adcc-d73622e1b2f1\" (UID: \"0e92099f-7960-419f-adcc-d73622e1b2f1\") " Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.598090 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ceph" (OuterVolumeSpecName: "ceph") pod "0e92099f-7960-419f-adcc-d73622e1b2f1" (UID: "0e92099f-7960-419f-adcc-d73622e1b2f1"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.598462 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e92099f-7960-419f-adcc-d73622e1b2f1-kube-api-access-v4lsk" (OuterVolumeSpecName: "kube-api-access-v4lsk") pod "0e92099f-7960-419f-adcc-d73622e1b2f1" (UID: "0e92099f-7960-419f-adcc-d73622e1b2f1"). InnerVolumeSpecName "kube-api-access-v4lsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.619948 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0e92099f-7960-419f-adcc-d73622e1b2f1" (UID: "0e92099f-7960-419f-adcc-d73622e1b2f1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.621291 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0e92099f-7960-419f-adcc-d73622e1b2f1" (UID: "0e92099f-7960-419f-adcc-d73622e1b2f1"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.696011 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.696040 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4lsk\" (UniqueName: \"kubernetes.io/projected/0e92099f-7960-419f-adcc-d73622e1b2f1-kube-api-access-v4lsk\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.696055 4797 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:07 crc kubenswrapper[4797]: I1013 15:02:07.696067 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0e92099f-7960-419f-adcc-d73622e1b2f1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.023163 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-6c9b5" event={"ID":"0e92099f-7960-419f-adcc-d73622e1b2f1","Type":"ContainerDied","Data":"da73d0a4f538042263a179090e7b6a273fb377924141fc552eaaacce5486f2ac"} Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.023209 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da73d0a4f538042263a179090e7b6a273fb377924141fc552eaaacce5486f2ac" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.023237 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-6c9b5" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.087154 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-29bxv"] Oct 13 15:02:08 crc kubenswrapper[4797]: E1013 15:02:08.087791 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e92099f-7960-419f-adcc-d73622e1b2f1" containerName="ssh-known-hosts-openstack" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.087886 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e92099f-7960-419f-adcc-d73622e1b2f1" containerName="ssh-known-hosts-openstack" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.088165 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e92099f-7960-419f-adcc-d73622e1b2f1" containerName="ssh-known-hosts-openstack" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.089171 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.092149 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.092330 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.092664 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.092850 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.141767 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-29bxv"] Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 
15:02:08.204536 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ceph\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.204663 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-inventory\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.204845 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzggg\" (UniqueName: \"kubernetes.io/projected/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-kube-api-access-zzggg\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.204909 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ssh-key\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.306475 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ssh-key\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " 
pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.306767 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ceph\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.306882 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-inventory\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.307053 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzggg\" (UniqueName: \"kubernetes.io/projected/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-kube-api-access-zzggg\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.315459 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ssh-key\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.316232 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-inventory\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " 
pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.326823 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ceph\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.328402 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzggg\" (UniqueName: \"kubernetes.io/projected/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-kube-api-access-zzggg\") pod \"run-os-openstack-openstack-cell1-29bxv\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.414935 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:08 crc kubenswrapper[4797]: I1013 15:02:08.959221 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-29bxv"] Oct 13 15:02:09 crc kubenswrapper[4797]: I1013 15:02:09.034238 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-29bxv" event={"ID":"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0","Type":"ContainerStarted","Data":"dcf77ecd843a44a5ccb7941beb648262f5dea790a7b7a496a9ba6db962497aaf"} Oct 13 15:02:10 crc kubenswrapper[4797]: I1013 15:02:10.047365 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-29bxv" event={"ID":"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0","Type":"ContainerStarted","Data":"e6c4d2f7d6ab89927d7b16c61d6aa6b2c4bbfc9d3d0015f22a93eb98c0264564"} Oct 13 15:02:10 crc kubenswrapper[4797]: I1013 15:02:10.073994 4797 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/run-os-openstack-openstack-cell1-29bxv" podStartSLOduration=1.472631081 podStartE2EDuration="2.073961628s" podCreationTimestamp="2025-10-13 15:02:08 +0000 UTC" firstStartedPulling="2025-10-13 15:02:08.972199852 +0000 UTC m=+6906.505750108" lastFinishedPulling="2025-10-13 15:02:09.573530389 +0000 UTC m=+6907.107080655" observedRunningTime="2025-10-13 15:02:10.065170151 +0000 UTC m=+6907.598720407" watchObservedRunningTime="2025-10-13 15:02:10.073961628 +0000 UTC m=+6907.607511884" Oct 13 15:02:17 crc kubenswrapper[4797]: I1013 15:02:17.121187 4797 generic.go:334] "Generic (PLEG): container finished" podID="c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" containerID="e6c4d2f7d6ab89927d7b16c61d6aa6b2c4bbfc9d3d0015f22a93eb98c0264564" exitCode=0 Oct 13 15:02:17 crc kubenswrapper[4797]: I1013 15:02:17.121304 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-29bxv" event={"ID":"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0","Type":"ContainerDied","Data":"e6c4d2f7d6ab89927d7b16c61d6aa6b2c4bbfc9d3d0015f22a93eb98c0264564"} Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.588999 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.765600 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-inventory\") pod \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.766002 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ssh-key\") pod \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.766086 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ceph\") pod \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.766184 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzggg\" (UniqueName: \"kubernetes.io/projected/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-kube-api-access-zzggg\") pod \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\" (UID: \"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0\") " Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.772727 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-kube-api-access-zzggg" (OuterVolumeSpecName: "kube-api-access-zzggg") pod "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" (UID: "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0"). InnerVolumeSpecName "kube-api-access-zzggg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.791623 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ceph" (OuterVolumeSpecName: "ceph") pod "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" (UID: "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.817654 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-inventory" (OuterVolumeSpecName: "inventory") pod "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" (UID: "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.837226 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" (UID: "c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.868570 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.868597 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.868606 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:18 crc kubenswrapper[4797]: I1013 15:02:18.868615 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzggg\" (UniqueName: \"kubernetes.io/projected/c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0-kube-api-access-zzggg\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.141171 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-29bxv" event={"ID":"c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0","Type":"ContainerDied","Data":"dcf77ecd843a44a5ccb7941beb648262f5dea790a7b7a496a9ba6db962497aaf"} Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.141228 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcf77ecd843a44a5ccb7941beb648262f5dea790a7b7a496a9ba6db962497aaf" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.141283 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-29bxv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.221033 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xl2wv"] Oct 13 15:02:19 crc kubenswrapper[4797]: E1013 15:02:19.221588 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" containerName="run-os-openstack-openstack-cell1" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.221612 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" containerName="run-os-openstack-openstack-cell1" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.221919 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0" containerName="run-os-openstack-openstack-cell1" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.223017 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.227302 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.227715 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.227966 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.228129 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.233724 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xl2wv"] Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.308853 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.308936 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-inventory\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.311090 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ceph\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.311171 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6nr2\" (UniqueName: \"kubernetes.io/projected/24bc34c6-31c1-400b-8ea5-f857626afde4-kube-api-access-f6nr2\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.412559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ceph\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.412613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6nr2\" (UniqueName: \"kubernetes.io/projected/24bc34c6-31c1-400b-8ea5-f857626afde4-kube-api-access-f6nr2\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.412690 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.412708 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-inventory\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.416343 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-inventory\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.416768 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ceph\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.419935 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.429186 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6nr2\" (UniqueName: \"kubernetes.io/projected/24bc34c6-31c1-400b-8ea5-f857626afde4-kube-api-access-f6nr2\") pod \"reboot-os-openstack-openstack-cell1-xl2wv\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:19 crc kubenswrapper[4797]: I1013 15:02:19.545205 4797 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:20 crc kubenswrapper[4797]: I1013 15:02:20.104229 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xl2wv"] Oct 13 15:02:20 crc kubenswrapper[4797]: I1013 15:02:20.152700 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" event={"ID":"24bc34c6-31c1-400b-8ea5-f857626afde4","Type":"ContainerStarted","Data":"eb9bf06d02635404f47db85da8655e9dc50d80ba240271bd77b0d1bf89df52fa"} Oct 13 15:02:21 crc kubenswrapper[4797]: I1013 15:02:21.163120 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" event={"ID":"24bc34c6-31c1-400b-8ea5-f857626afde4","Type":"ContainerStarted","Data":"6a4a8ba92278f6e5768a0ffceb9cd79d7621ad2aa6ec424a04e787b0e5232b70"} Oct 13 15:02:21 crc kubenswrapper[4797]: I1013 15:02:21.183221 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" podStartSLOduration=1.735943917 podStartE2EDuration="2.183197215s" podCreationTimestamp="2025-10-13 15:02:19 +0000 UTC" firstStartedPulling="2025-10-13 15:02:20.105506472 +0000 UTC m=+6917.639056728" lastFinishedPulling="2025-10-13 15:02:20.55275977 +0000 UTC m=+6918.086310026" observedRunningTime="2025-10-13 15:02:21.181209156 +0000 UTC m=+6918.714759432" watchObservedRunningTime="2025-10-13 15:02:21.183197215 +0000 UTC m=+6918.716747471" Oct 13 15:02:36 crc kubenswrapper[4797]: I1013 15:02:36.304803 4797 generic.go:334] "Generic (PLEG): container finished" podID="24bc34c6-31c1-400b-8ea5-f857626afde4" containerID="6a4a8ba92278f6e5768a0ffceb9cd79d7621ad2aa6ec424a04e787b0e5232b70" exitCode=0 Oct 13 15:02:36 crc kubenswrapper[4797]: I1013 15:02:36.304854 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" 
event={"ID":"24bc34c6-31c1-400b-8ea5-f857626afde4","Type":"ContainerDied","Data":"6a4a8ba92278f6e5768a0ffceb9cd79d7621ad2aa6ec424a04e787b0e5232b70"} Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.739276 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.875041 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ssh-key\") pod \"24bc34c6-31c1-400b-8ea5-f857626afde4\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.875085 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ceph\") pod \"24bc34c6-31c1-400b-8ea5-f857626afde4\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.875173 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-inventory\") pod \"24bc34c6-31c1-400b-8ea5-f857626afde4\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.875201 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6nr2\" (UniqueName: \"kubernetes.io/projected/24bc34c6-31c1-400b-8ea5-f857626afde4-kube-api-access-f6nr2\") pod \"24bc34c6-31c1-400b-8ea5-f857626afde4\" (UID: \"24bc34c6-31c1-400b-8ea5-f857626afde4\") " Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.881071 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bc34c6-31c1-400b-8ea5-f857626afde4-kube-api-access-f6nr2" (OuterVolumeSpecName: 
"kube-api-access-f6nr2") pod "24bc34c6-31c1-400b-8ea5-f857626afde4" (UID: "24bc34c6-31c1-400b-8ea5-f857626afde4"). InnerVolumeSpecName "kube-api-access-f6nr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.881154 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ceph" (OuterVolumeSpecName: "ceph") pod "24bc34c6-31c1-400b-8ea5-f857626afde4" (UID: "24bc34c6-31c1-400b-8ea5-f857626afde4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.904780 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-inventory" (OuterVolumeSpecName: "inventory") pod "24bc34c6-31c1-400b-8ea5-f857626afde4" (UID: "24bc34c6-31c1-400b-8ea5-f857626afde4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.904928 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "24bc34c6-31c1-400b-8ea5-f857626afde4" (UID: "24bc34c6-31c1-400b-8ea5-f857626afde4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.978533 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.978572 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.978584 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24bc34c6-31c1-400b-8ea5-f857626afde4-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:37 crc kubenswrapper[4797]: I1013 15:02:37.978598 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6nr2\" (UniqueName: \"kubernetes.io/projected/24bc34c6-31c1-400b-8ea5-f857626afde4-kube-api-access-f6nr2\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.326189 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" event={"ID":"24bc34c6-31c1-400b-8ea5-f857626afde4","Type":"ContainerDied","Data":"eb9bf06d02635404f47db85da8655e9dc50d80ba240271bd77b0d1bf89df52fa"} Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.326229 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb9bf06d02635404f47db85da8655e9dc50d80ba240271bd77b0d1bf89df52fa" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.326355 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xl2wv" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.403887 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qvlrc"] Oct 13 15:02:38 crc kubenswrapper[4797]: E1013 15:02:38.404402 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bc34c6-31c1-400b-8ea5-f857626afde4" containerName="reboot-os-openstack-openstack-cell1" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.404421 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bc34c6-31c1-400b-8ea5-f857626afde4" containerName="reboot-os-openstack-openstack-cell1" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.404671 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bc34c6-31c1-400b-8ea5-f857626afde4" containerName="reboot-os-openstack-openstack-cell1" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.406518 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.410671 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.411043 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.412827 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.413857 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.420242 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qvlrc"] Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489152 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489228 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px26w\" (UniqueName: \"kubernetes.io/projected/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-kube-api-access-px26w\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489288 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489383 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489418 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489445 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-inventory\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489580 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ssh-key\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489650 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489688 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489724 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: 
\"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.489756 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ceph\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592184 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px26w\" (UniqueName: \"kubernetes.io/projected/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-kube-api-access-px26w\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592278 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592331 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592365 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592400 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-inventory\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592453 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ssh-key\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592503 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " 
pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592536 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ceph\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.592676 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.597447 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ssh-key\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.597601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-inventory\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.597868 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.597990 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ceph\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.598035 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.598221 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.598633 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.598719 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.599492 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.600523 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-dhcp-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.605783 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.608625 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px26w\" (UniqueName: \"kubernetes.io/projected/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-kube-api-access-px26w\") pod \"install-certs-openstack-openstack-cell1-qvlrc\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:38 crc kubenswrapper[4797]: I1013 15:02:38.726367 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:39 crc kubenswrapper[4797]: I1013 15:02:39.299474 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qvlrc"] Oct 13 15:02:39 crc kubenswrapper[4797]: I1013 15:02:39.338493 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" event={"ID":"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24","Type":"ContainerStarted","Data":"7d9a2e8f7263fae920dda08560852907104d952533e76b12a53cc89f98838280"} Oct 13 15:02:40 crc kubenswrapper[4797]: I1013 15:02:40.354424 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" event={"ID":"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24","Type":"ContainerStarted","Data":"5381b4fa7b6cb63d755de2b0c9d0c4c58629b1abc34f11c554034c5660a66504"} Oct 13 15:02:40 crc kubenswrapper[4797]: I1013 15:02:40.385276 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" podStartSLOduration=1.949049607 podStartE2EDuration="2.385237072s" podCreationTimestamp="2025-10-13 15:02:38 +0000 UTC" firstStartedPulling="2025-10-13 15:02:39.300012484 +0000 UTC m=+6936.833562780" lastFinishedPulling="2025-10-13 15:02:39.736199989 +0000 UTC m=+6937.269750245" observedRunningTime="2025-10-13 15:02:40.377703827 +0000 UTC m=+6937.911254123" watchObservedRunningTime="2025-10-13 15:02:40.385237072 +0000 UTC m=+6937.918787338" Oct 13 15:02:57 crc kubenswrapper[4797]: I1013 15:02:57.527610 4797 generic.go:334] "Generic (PLEG): container finished" podID="a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" containerID="5381b4fa7b6cb63d755de2b0c9d0c4c58629b1abc34f11c554034c5660a66504" exitCode=0 Oct 13 15:02:57 crc kubenswrapper[4797]: I1013 15:02:57.527691 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" event={"ID":"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24","Type":"ContainerDied","Data":"5381b4fa7b6cb63d755de2b0c9d0c4c58629b1abc34f11c554034c5660a66504"} Oct 13 15:02:58 crc kubenswrapper[4797]: I1013 15:02:58.981606 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.056601 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-metadata-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.056709 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-dhcp-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.056751 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-sriov-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.056787 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-bootstrap-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc 
kubenswrapper[4797]: I1013 15:02:59.056895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px26w\" (UniqueName: \"kubernetes.io/projected/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-kube-api-access-px26w\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.056925 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-telemetry-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.057052 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ssh-key\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.057126 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-libvirt-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.057150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ceph\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.057186 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-inventory\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.057204 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-nova-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.057238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ovn-combined-ca-bundle\") pod \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\" (UID: \"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24\") " Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.063598 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ceph" (OuterVolumeSpecName: "ceph") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.064211 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.064991 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.065553 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.065896 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.066039 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.066212 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.067417 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.069231 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.070165 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-kube-api-access-px26w" (OuterVolumeSpecName: "kube-api-access-px26w") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "kube-api-access-px26w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.089623 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.097862 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-inventory" (OuterVolumeSpecName: "inventory") pod "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" (UID: "a9088bd4-a2fd-4a3e-a3ff-b43cf035de24"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159661 4797 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159698 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159711 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159721 4797 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc 
kubenswrapper[4797]: I1013 15:02:59.159735 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159746 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159757 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159770 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159780 4797 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159793 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px26w\" (UniqueName: \"kubernetes.io/projected/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-kube-api-access-px26w\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159827 4797 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.159839 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9088bd4-a2fd-4a3e-a3ff-b43cf035de24-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.548492 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" event={"ID":"a9088bd4-a2fd-4a3e-a3ff-b43cf035de24","Type":"ContainerDied","Data":"7d9a2e8f7263fae920dda08560852907104d952533e76b12a53cc89f98838280"} Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.548551 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d9a2e8f7263fae920dda08560852907104d952533e76b12a53cc89f98838280" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.548581 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qvlrc" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.684745 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-kpsp9"] Oct 13 15:02:59 crc kubenswrapper[4797]: E1013 15:02:59.685564 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" containerName="install-certs-openstack-openstack-cell1" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.685579 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" containerName="install-certs-openstack-openstack-cell1" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.685857 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9088bd4-a2fd-4a3e-a3ff-b43cf035de24" containerName="install-certs-openstack-openstack-cell1" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.686680 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.693046 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.693140 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.693338 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.693439 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.711416 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-kpsp9"] Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.778034 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhn4d\" (UniqueName: \"kubernetes.io/projected/8d868a76-1350-41ea-875b-b4841f483390-kube-api-access-qhn4d\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.778115 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ceph\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.778160 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.778281 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-inventory\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.879690 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-inventory\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.879778 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhn4d\" (UniqueName: \"kubernetes.io/projected/8d868a76-1350-41ea-875b-b4841f483390-kube-api-access-qhn4d\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.879879 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ceph\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.879940 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.884790 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.885273 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-inventory\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.886649 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ceph\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:02:59 crc kubenswrapper[4797]: I1013 15:02:59.900704 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhn4d\" (UniqueName: \"kubernetes.io/projected/8d868a76-1350-41ea-875b-b4841f483390-kube-api-access-qhn4d\") pod \"ceph-client-openstack-openstack-cell1-kpsp9\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:03:00 crc kubenswrapper[4797]: I1013 
15:03:00.018570 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:03:00 crc kubenswrapper[4797]: I1013 15:03:00.551156 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-kpsp9"] Oct 13 15:03:01 crc kubenswrapper[4797]: I1013 15:03:01.569392 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" event={"ID":"8d868a76-1350-41ea-875b-b4841f483390","Type":"ContainerStarted","Data":"c61b09b6a36d2e21782a09d264306527ce9e1c4bb85443b1d562723ee83191ae"} Oct 13 15:03:01 crc kubenswrapper[4797]: I1013 15:03:01.569831 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" event={"ID":"8d868a76-1350-41ea-875b-b4841f483390","Type":"ContainerStarted","Data":"165597cd6e1b7d19af717f5d6186220744c52b0f07b3cdf1941959e7eee6e6a1"} Oct 13 15:03:01 crc kubenswrapper[4797]: I1013 15:03:01.597624 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" podStartSLOduration=2.109560064 podStartE2EDuration="2.597598517s" podCreationTimestamp="2025-10-13 15:02:59 +0000 UTC" firstStartedPulling="2025-10-13 15:03:00.552368525 +0000 UTC m=+6958.085918781" lastFinishedPulling="2025-10-13 15:03:01.040406978 +0000 UTC m=+6958.573957234" observedRunningTime="2025-10-13 15:03:01.586403651 +0000 UTC m=+6959.119953927" watchObservedRunningTime="2025-10-13 15:03:01.597598517 +0000 UTC m=+6959.131148773" Oct 13 15:03:06 crc kubenswrapper[4797]: I1013 15:03:06.630979 4797 generic.go:334] "Generic (PLEG): container finished" podID="8d868a76-1350-41ea-875b-b4841f483390" containerID="c61b09b6a36d2e21782a09d264306527ce9e1c4bb85443b1d562723ee83191ae" exitCode=0 Oct 13 15:03:06 crc kubenswrapper[4797]: I1013 15:03:06.631134 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" event={"ID":"8d868a76-1350-41ea-875b-b4841f483390","Type":"ContainerDied","Data":"c61b09b6a36d2e21782a09d264306527ce9e1c4bb85443b1d562723ee83191ae"} Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.185229 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.276674 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ceph\") pod \"8d868a76-1350-41ea-875b-b4841f483390\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.276747 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-inventory\") pod \"8d868a76-1350-41ea-875b-b4841f483390\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.276795 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhn4d\" (UniqueName: \"kubernetes.io/projected/8d868a76-1350-41ea-875b-b4841f483390-kube-api-access-qhn4d\") pod \"8d868a76-1350-41ea-875b-b4841f483390\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.276947 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ssh-key\") pod \"8d868a76-1350-41ea-875b-b4841f483390\" (UID: \"8d868a76-1350-41ea-875b-b4841f483390\") " Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.281724 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ceph" (OuterVolumeSpecName: "ceph") pod "8d868a76-1350-41ea-875b-b4841f483390" (UID: "8d868a76-1350-41ea-875b-b4841f483390"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.297796 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d868a76-1350-41ea-875b-b4841f483390-kube-api-access-qhn4d" (OuterVolumeSpecName: "kube-api-access-qhn4d") pod "8d868a76-1350-41ea-875b-b4841f483390" (UID: "8d868a76-1350-41ea-875b-b4841f483390"). InnerVolumeSpecName "kube-api-access-qhn4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.304330 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-inventory" (OuterVolumeSpecName: "inventory") pod "8d868a76-1350-41ea-875b-b4841f483390" (UID: "8d868a76-1350-41ea-875b-b4841f483390"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.309316 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d868a76-1350-41ea-875b-b4841f483390" (UID: "8d868a76-1350-41ea-875b-b4841f483390"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.382418 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhn4d\" (UniqueName: \"kubernetes.io/projected/8d868a76-1350-41ea-875b-b4841f483390-kube-api-access-qhn4d\") on node \"crc\" DevicePath \"\"" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.382453 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.382466 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.382478 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d868a76-1350-41ea-875b-b4841f483390-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.652779 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" event={"ID":"8d868a76-1350-41ea-875b-b4841f483390","Type":"ContainerDied","Data":"165597cd6e1b7d19af717f5d6186220744c52b0f07b3cdf1941959e7eee6e6a1"} Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.652845 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165597cd6e1b7d19af717f5d6186220744c52b0f07b3cdf1941959e7eee6e6a1" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.652936 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kpsp9" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.737996 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-57ps4"] Oct 13 15:03:08 crc kubenswrapper[4797]: E1013 15:03:08.738718 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d868a76-1350-41ea-875b-b4841f483390" containerName="ceph-client-openstack-openstack-cell1" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.738735 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d868a76-1350-41ea-875b-b4841f483390" containerName="ceph-client-openstack-openstack-cell1" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.738976 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d868a76-1350-41ea-875b-b4841f483390" containerName="ceph-client-openstack-openstack-cell1" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.739729 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.742463 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.743433 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.747751 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.748028 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.749706 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.761288 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-57ps4"] Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.790706 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.790795 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 
15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.790912 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-inventory\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.790946 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ssh-key\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.790990 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fbsk\" (UniqueName: \"kubernetes.io/projected/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-kube-api-access-4fbsk\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.791049 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ceph\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.893404 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ceph\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " 
pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.893475 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.893559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.893613 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-inventory\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.893654 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ssh-key\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.893711 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fbsk\" (UniqueName: \"kubernetes.io/projected/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-kube-api-access-4fbsk\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: 
\"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.894604 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.897756 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ceph\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.897857 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-inventory\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.900488 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ssh-key\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.904171 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " 
pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:08 crc kubenswrapper[4797]: I1013 15:03:08.921383 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fbsk\" (UniqueName: \"kubernetes.io/projected/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-kube-api-access-4fbsk\") pod \"ovn-openstack-openstack-cell1-57ps4\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:09 crc kubenswrapper[4797]: I1013 15:03:09.072261 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:03:09 crc kubenswrapper[4797]: I1013 15:03:09.714655 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-57ps4"] Oct 13 15:03:10 crc kubenswrapper[4797]: I1013 15:03:10.671714 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-57ps4" event={"ID":"366a3bab-eb21-41c6-b4e3-52f6e34cc08c","Type":"ContainerStarted","Data":"dd02c96f3f2ab28e54e0be4ff1ed9456fcb9eec8c08c0d99eae2b057ff39be01"} Oct 13 15:03:10 crc kubenswrapper[4797]: I1013 15:03:10.672094 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-57ps4" event={"ID":"366a3bab-eb21-41c6-b4e3-52f6e34cc08c","Type":"ContainerStarted","Data":"115b9e15bdf7729b9c12c89bd738820733ccca13d661e50e2b8f649f931935f0"} Oct 13 15:03:10 crc kubenswrapper[4797]: I1013 15:03:10.697245 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-57ps4" podStartSLOduration=2.288192527 podStartE2EDuration="2.697226863s" podCreationTimestamp="2025-10-13 15:03:08 +0000 UTC" firstStartedPulling="2025-10-13 15:03:09.713475386 +0000 UTC m=+6967.247025642" lastFinishedPulling="2025-10-13 15:03:10.122509712 +0000 UTC m=+6967.656059978" observedRunningTime="2025-10-13 15:03:10.686399256 +0000 UTC 
m=+6968.219949542" watchObservedRunningTime="2025-10-13 15:03:10.697226863 +0000 UTC m=+6968.230777119" Oct 13 15:03:48 crc kubenswrapper[4797]: I1013 15:03:48.120012 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:03:48 crc kubenswrapper[4797]: I1013 15:03:48.120578 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:04:12 crc kubenswrapper[4797]: I1013 15:04:12.359235 4797 generic.go:334] "Generic (PLEG): container finished" podID="366a3bab-eb21-41c6-b4e3-52f6e34cc08c" containerID="dd02c96f3f2ab28e54e0be4ff1ed9456fcb9eec8c08c0d99eae2b057ff39be01" exitCode=0 Oct 13 15:04:12 crc kubenswrapper[4797]: I1013 15:04:12.359316 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-57ps4" event={"ID":"366a3bab-eb21-41c6-b4e3-52f6e34cc08c","Type":"ContainerDied","Data":"dd02c96f3f2ab28e54e0be4ff1ed9456fcb9eec8c08c0d99eae2b057ff39be01"} Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.818961 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.930621 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-inventory\") pod \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.930711 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ssh-key\") pod \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.930867 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovncontroller-config-0\") pod \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.930924 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ceph\") pod \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.930966 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fbsk\" (UniqueName: \"kubernetes.io/projected/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-kube-api-access-4fbsk\") pod \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.930996 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovn-combined-ca-bundle\") pod \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\" (UID: \"366a3bab-eb21-41c6-b4e3-52f6e34cc08c\") " Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.939404 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ceph" (OuterVolumeSpecName: "ceph") pod "366a3bab-eb21-41c6-b4e3-52f6e34cc08c" (UID: "366a3bab-eb21-41c6-b4e3-52f6e34cc08c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.939523 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-kube-api-access-4fbsk" (OuterVolumeSpecName: "kube-api-access-4fbsk") pod "366a3bab-eb21-41c6-b4e3-52f6e34cc08c" (UID: "366a3bab-eb21-41c6-b4e3-52f6e34cc08c"). InnerVolumeSpecName "kube-api-access-4fbsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.939479 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "366a3bab-eb21-41c6-b4e3-52f6e34cc08c" (UID: "366a3bab-eb21-41c6-b4e3-52f6e34cc08c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.970078 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "366a3bab-eb21-41c6-b4e3-52f6e34cc08c" (UID: "366a3bab-eb21-41c6-b4e3-52f6e34cc08c"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.970280 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "366a3bab-eb21-41c6-b4e3-52f6e34cc08c" (UID: "366a3bab-eb21-41c6-b4e3-52f6e34cc08c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:04:13 crc kubenswrapper[4797]: I1013 15:04:13.977181 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-inventory" (OuterVolumeSpecName: "inventory") pod "366a3bab-eb21-41c6-b4e3-52f6e34cc08c" (UID: "366a3bab-eb21-41c6-b4e3-52f6e34cc08c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.033518 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.033556 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.033566 4797 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.033577 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.033585 4797 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fbsk\" (UniqueName: \"kubernetes.io/projected/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-kube-api-access-4fbsk\") on node \"crc\" DevicePath \"\"" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.033594 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/366a3bab-eb21-41c6-b4e3-52f6e34cc08c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.381480 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-57ps4" event={"ID":"366a3bab-eb21-41c6-b4e3-52f6e34cc08c","Type":"ContainerDied","Data":"115b9e15bdf7729b9c12c89bd738820733ccca13d661e50e2b8f649f931935f0"} Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.381544 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="115b9e15bdf7729b9c12c89bd738820733ccca13d661e50e2b8f649f931935f0" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.381515 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-57ps4" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.469761 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-mgjkg"] Oct 13 15:04:14 crc kubenswrapper[4797]: E1013 15:04:14.470196 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366a3bab-eb21-41c6-b4e3-52f6e34cc08c" containerName="ovn-openstack-openstack-cell1" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.470214 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="366a3bab-eb21-41c6-b4e3-52f6e34cc08c" containerName="ovn-openstack-openstack-cell1" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.470415 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="366a3bab-eb21-41c6-b4e3-52f6e34cc08c" containerName="ovn-openstack-openstack-cell1" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.471336 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.473560 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.473967 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.474489 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.474661 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.474508 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.475012 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.491526 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-mgjkg"] Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544253 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544404 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544474 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544532 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4nhk\" (UniqueName: \"kubernetes.io/projected/bcae55b0-a4a1-40c4-85a0-de5a29520295-kube-api-access-c4nhk\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544560 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.544597 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.646654 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.646974 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.647071 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.647110 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.647163 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4nhk\" (UniqueName: \"kubernetes.io/projected/bcae55b0-a4a1-40c4-85a0-de5a29520295-kube-api-access-c4nhk\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.647185 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.647211 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.651523 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.651972 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.652410 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.652794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.653255 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.653414 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.666308 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4nhk\" (UniqueName: \"kubernetes.io/projected/bcae55b0-a4a1-40c4-85a0-de5a29520295-kube-api-access-c4nhk\") pod \"neutron-metadata-openstack-openstack-cell1-mgjkg\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:14 crc kubenswrapper[4797]: I1013 15:04:14.797129 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:04:15 crc kubenswrapper[4797]: I1013 15:04:15.332066 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-mgjkg"] Oct 13 15:04:15 crc kubenswrapper[4797]: I1013 15:04:15.398045 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" event={"ID":"bcae55b0-a4a1-40c4-85a0-de5a29520295","Type":"ContainerStarted","Data":"797363b84c7128470f97f6d59ea79a79b628fc674af740782f2c08873185c8d8"} Oct 13 15:04:16 crc kubenswrapper[4797]: I1013 15:04:16.411701 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" event={"ID":"bcae55b0-a4a1-40c4-85a0-de5a29520295","Type":"ContainerStarted","Data":"67ed21e6941ce3e45df34e1270f507c8f4cbc30bda388a3263ca9b7122a41421"} Oct 13 15:04:16 crc kubenswrapper[4797]: I1013 15:04:16.447649 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" podStartSLOduration=1.914533732 podStartE2EDuration="2.447613096s" podCreationTimestamp="2025-10-13 15:04:14 +0000 UTC" firstStartedPulling="2025-10-13 15:04:15.334735026 +0000 UTC m=+7032.868285292" lastFinishedPulling="2025-10-13 15:04:15.8678144 +0000 UTC m=+7033.401364656" observedRunningTime="2025-10-13 15:04:16.429895389 +0000 UTC m=+7033.963445685" watchObservedRunningTime="2025-10-13 15:04:16.447613096 +0000 UTC m=+7033.981163392" Oct 13 15:04:18 crc kubenswrapper[4797]: I1013 15:04:18.120297 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:04:18 crc kubenswrapper[4797]: I1013 
15:04:18.121049 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.120408 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.120967 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.121019 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.121935 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.122002 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" 
containerName="machine-config-daemon" containerID="cri-o://3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" gracePeriod=600 Oct 13 15:04:48 crc kubenswrapper[4797]: E1013 15:04:48.265435 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.741144 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" exitCode=0 Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.741176 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d"} Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.741220 4797 scope.go:117] "RemoveContainer" containerID="9ced763cf8f63ef478d23b8f41f116ec9c1aafb73fa9427083e41fbea89d39fa" Oct 13 15:04:48 crc kubenswrapper[4797]: I1013 15:04:48.741767 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:04:48 crc kubenswrapper[4797]: E1013 15:04:48.742247 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:05:03 crc kubenswrapper[4797]: I1013 15:05:03.242885 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:05:03 crc kubenswrapper[4797]: E1013 15:05:03.243578 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:05:06 crc kubenswrapper[4797]: I1013 15:05:06.953410 4797 generic.go:334] "Generic (PLEG): container finished" podID="bcae55b0-a4a1-40c4-85a0-de5a29520295" containerID="67ed21e6941ce3e45df34e1270f507c8f4cbc30bda388a3263ca9b7122a41421" exitCode=0 Oct 13 15:05:06 crc kubenswrapper[4797]: I1013 15:05:06.953592 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" event={"ID":"bcae55b0-a4a1-40c4-85a0-de5a29520295","Type":"ContainerDied","Data":"67ed21e6941ce3e45df34e1270f507c8f4cbc30bda388a3263ca9b7122a41421"} Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.491252 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575514 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-metadata-combined-ca-bundle\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575590 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-nova-metadata-neutron-config-0\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575640 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ssh-key\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-inventory\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575688 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4nhk\" (UniqueName: \"kubernetes.io/projected/bcae55b0-a4a1-40c4-85a0-de5a29520295-kube-api-access-c4nhk\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575874 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-ovn-metadata-agent-neutron-config-0\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.575910 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ceph\") pod \"bcae55b0-a4a1-40c4-85a0-de5a29520295\" (UID: \"bcae55b0-a4a1-40c4-85a0-de5a29520295\") " Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.608226 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.611966 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ceph" (OuterVolumeSpecName: "ceph") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.619343 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcae55b0-a4a1-40c4-85a0-de5a29520295-kube-api-access-c4nhk" (OuterVolumeSpecName: "kube-api-access-c4nhk") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "kube-api-access-c4nhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.659570 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-inventory" (OuterVolumeSpecName: "inventory") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.660923 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.669673 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.671034 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bcae55b0-a4a1-40c4-85a0-de5a29520295" (UID: "bcae55b0-a4a1-40c4-85a0-de5a29520295"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678506 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678542 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678554 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678563 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678574 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4nhk\" (UniqueName: \"kubernetes.io/projected/bcae55b0-a4a1-40c4-85a0-de5a29520295-kube-api-access-c4nhk\") on node \"crc\" DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678584 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.678594 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcae55b0-a4a1-40c4-85a0-de5a29520295-ceph\") on node \"crc\" 
DevicePath \"\"" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.976148 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" event={"ID":"bcae55b0-a4a1-40c4-85a0-de5a29520295","Type":"ContainerDied","Data":"797363b84c7128470f97f6d59ea79a79b628fc674af740782f2c08873185c8d8"} Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.976202 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="797363b84c7128470f97f6d59ea79a79b628fc674af740782f2c08873185c8d8" Oct 13 15:05:08 crc kubenswrapper[4797]: I1013 15:05:08.976210 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-mgjkg" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.090881 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-wkdb2"] Oct 13 15:05:09 crc kubenswrapper[4797]: E1013 15:05:09.091356 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcae55b0-a4a1-40c4-85a0-de5a29520295" containerName="neutron-metadata-openstack-openstack-cell1" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.091377 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcae55b0-a4a1-40c4-85a0-de5a29520295" containerName="neutron-metadata-openstack-openstack-cell1" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.091589 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcae55b0-a4a1-40c4-85a0-de5a29520295" containerName="neutron-metadata-openstack-openstack-cell1" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.092392 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.095288 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.095485 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.095633 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.095927 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.095628 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.106734 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-wkdb2"] Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.186930 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwrtw\" (UniqueName: \"kubernetes.io/projected/9da6c975-70b2-4296-9070-7608770a0446-kube-api-access-cwrtw\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.187301 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " 
pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.187505 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.187670 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ssh-key\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.187991 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ceph\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.188162 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-inventory\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.290151 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwrtw\" (UniqueName: \"kubernetes.io/projected/9da6c975-70b2-4296-9070-7608770a0446-kube-api-access-cwrtw\") pod 
\"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.290206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.290241 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.290281 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ssh-key\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.290413 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ceph\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.290440 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-inventory\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.295544 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.297931 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ceph\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.298228 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-inventory\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.298677 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ssh-key\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.302357 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.308090 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwrtw\" (UniqueName: \"kubernetes.io/projected/9da6c975-70b2-4296-9070-7608770a0446-kube-api-access-cwrtw\") pod \"libvirt-openstack-openstack-cell1-wkdb2\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.416783 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.985906 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:05:09 crc kubenswrapper[4797]: I1013 15:05:09.986758 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-wkdb2"] Oct 13 15:05:10 crc kubenswrapper[4797]: I1013 15:05:10.998502 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" event={"ID":"9da6c975-70b2-4296-9070-7608770a0446","Type":"ContainerStarted","Data":"6683657e5a30cf26bb2e075832d19bb24b79e64a11d6896ec09af01d33999795"} Oct 13 15:05:10 crc kubenswrapper[4797]: I1013 15:05:10.998926 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" event={"ID":"9da6c975-70b2-4296-9070-7608770a0446","Type":"ContainerStarted","Data":"67c3d8c00e97eeeff6f6d693c77c4d8a2db6926be03a1e5b7023d8ae50147fbc"} Oct 13 15:05:11 crc kubenswrapper[4797]: I1013 15:05:11.027598 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" podStartSLOduration=1.522761027 podStartE2EDuration="2.027581614s" podCreationTimestamp="2025-10-13 15:05:09 +0000 UTC" firstStartedPulling="2025-10-13 15:05:09.985649164 +0000 UTC m=+7087.519199420" lastFinishedPulling="2025-10-13 15:05:10.490469751 +0000 UTC m=+7088.024020007" observedRunningTime="2025-10-13 15:05:11.01766472 +0000 UTC m=+7088.551214996" watchObservedRunningTime="2025-10-13 15:05:11.027581614 +0000 UTC m=+7088.561131870" Oct 13 15:05:18 crc kubenswrapper[4797]: I1013 15:05:18.236850 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:05:18 crc kubenswrapper[4797]: E1013 15:05:18.237923 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:05:31 crc kubenswrapper[4797]: I1013 15:05:31.237309 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:05:31 crc kubenswrapper[4797]: E1013 15:05:31.238433 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:05:46 crc kubenswrapper[4797]: I1013 15:05:46.237116 4797 scope.go:117] "RemoveContainer" 
containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:05:46 crc kubenswrapper[4797]: E1013 15:05:46.239759 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:05:57 crc kubenswrapper[4797]: I1013 15:05:57.236423 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:05:57 crc kubenswrapper[4797]: E1013 15:05:57.237117 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:06:09 crc kubenswrapper[4797]: I1013 15:06:09.236554 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:06:09 crc kubenswrapper[4797]: E1013 15:06:09.238010 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:06:21 crc kubenswrapper[4797]: I1013 15:06:21.236503 4797 scope.go:117] 
"RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:06:21 crc kubenswrapper[4797]: E1013 15:06:21.237383 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:06:36 crc kubenswrapper[4797]: I1013 15:06:36.236692 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:06:36 crc kubenswrapper[4797]: E1013 15:06:36.238006 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:06:48 crc kubenswrapper[4797]: I1013 15:06:48.237629 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:06:48 crc kubenswrapper[4797]: E1013 15:06:48.239097 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:07:01 crc kubenswrapper[4797]: I1013 15:07:01.237200 
4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:07:01 crc kubenswrapper[4797]: E1013 15:07:01.238240 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:07:12 crc kubenswrapper[4797]: I1013 15:07:12.237615 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:07:12 crc kubenswrapper[4797]: E1013 15:07:12.239037 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:07:23 crc kubenswrapper[4797]: I1013 15:07:23.245112 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:07:23 crc kubenswrapper[4797]: E1013 15:07:23.246142 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 
15:07:36.438384 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k529c"] Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.444349 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.486581 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k529c"] Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.534441 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-utilities\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.534742 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-catalog-content\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.535034 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxw2\" (UniqueName: \"kubernetes.io/projected/8d2720b8-138e-4bca-8a6a-7640142a114f-kube-api-access-9cxw2\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.637860 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxw2\" (UniqueName: 
\"kubernetes.io/projected/8d2720b8-138e-4bca-8a6a-7640142a114f-kube-api-access-9cxw2\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.637959 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-utilities\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.637982 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-catalog-content\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.638654 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-catalog-content\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.639038 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-utilities\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.668601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxw2\" (UniqueName: 
\"kubernetes.io/projected/8d2720b8-138e-4bca-8a6a-7640142a114f-kube-api-access-9cxw2\") pod \"community-operators-k529c\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:36 crc kubenswrapper[4797]: I1013 15:07:36.783222 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:37 crc kubenswrapper[4797]: I1013 15:07:37.237250 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:07:37 crc kubenswrapper[4797]: E1013 15:07:37.238443 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:07:37 crc kubenswrapper[4797]: I1013 15:07:37.408526 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k529c"] Oct 13 15:07:37 crc kubenswrapper[4797]: I1013 15:07:37.603625 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k529c" event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerStarted","Data":"ebf886e1f9bb59413f0615c47cd807b9998605437ca613a2086a38e5869d37f2"} Oct 13 15:07:38 crc kubenswrapper[4797]: I1013 15:07:38.618677 4797 generic.go:334] "Generic (PLEG): container finished" podID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerID="75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde" exitCode=0 Oct 13 15:07:38 crc kubenswrapper[4797]: I1013 15:07:38.618764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-k529c" event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerDied","Data":"75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde"} Oct 13 15:07:39 crc kubenswrapper[4797]: I1013 15:07:39.633974 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k529c" event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerStarted","Data":"a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80"} Oct 13 15:07:40 crc kubenswrapper[4797]: I1013 15:07:40.646313 4797 generic.go:334] "Generic (PLEG): container finished" podID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerID="a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80" exitCode=0 Oct 13 15:07:40 crc kubenswrapper[4797]: I1013 15:07:40.646358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k529c" event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerDied","Data":"a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80"} Oct 13 15:07:41 crc kubenswrapper[4797]: I1013 15:07:41.689411 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k529c" podStartSLOduration=2.889104088 podStartE2EDuration="5.689387505s" podCreationTimestamp="2025-10-13 15:07:36 +0000 UTC" firstStartedPulling="2025-10-13 15:07:38.621551791 +0000 UTC m=+7236.155102047" lastFinishedPulling="2025-10-13 15:07:41.421835208 +0000 UTC m=+7238.955385464" observedRunningTime="2025-10-13 15:07:41.682832183 +0000 UTC m=+7239.216382469" watchObservedRunningTime="2025-10-13 15:07:41.689387505 +0000 UTC m=+7239.222937771" Oct 13 15:07:42 crc kubenswrapper[4797]: I1013 15:07:42.691563 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k529c" 
event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerStarted","Data":"f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181"} Oct 13 15:07:46 crc kubenswrapper[4797]: I1013 15:07:46.783786 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:46 crc kubenswrapper[4797]: I1013 15:07:46.785384 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:46 crc kubenswrapper[4797]: I1013 15:07:46.837767 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:47 crc kubenswrapper[4797]: I1013 15:07:47.836505 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:47 crc kubenswrapper[4797]: I1013 15:07:47.894225 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k529c"] Oct 13 15:07:49 crc kubenswrapper[4797]: I1013 15:07:49.236940 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:07:49 crc kubenswrapper[4797]: E1013 15:07:49.237286 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:07:49 crc kubenswrapper[4797]: I1013 15:07:49.785492 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k529c" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" 
containerName="registry-server" containerID="cri-o://f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181" gracePeriod=2 Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.262282 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.362745 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-utilities\") pod \"8d2720b8-138e-4bca-8a6a-7640142a114f\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.363711 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-utilities" (OuterVolumeSpecName: "utilities") pod "8d2720b8-138e-4bca-8a6a-7640142a114f" (UID: "8d2720b8-138e-4bca-8a6a-7640142a114f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.363864 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-catalog-content\") pod \"8d2720b8-138e-4bca-8a6a-7640142a114f\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.364076 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cxw2\" (UniqueName: \"kubernetes.io/projected/8d2720b8-138e-4bca-8a6a-7640142a114f-kube-api-access-9cxw2\") pod \"8d2720b8-138e-4bca-8a6a-7640142a114f\" (UID: \"8d2720b8-138e-4bca-8a6a-7640142a114f\") " Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.366096 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.371978 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d2720b8-138e-4bca-8a6a-7640142a114f-kube-api-access-9cxw2" (OuterVolumeSpecName: "kube-api-access-9cxw2") pod "8d2720b8-138e-4bca-8a6a-7640142a114f" (UID: "8d2720b8-138e-4bca-8a6a-7640142a114f"). InnerVolumeSpecName "kube-api-access-9cxw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.415625 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d2720b8-138e-4bca-8a6a-7640142a114f" (UID: "8d2720b8-138e-4bca-8a6a-7640142a114f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.469340 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d2720b8-138e-4bca-8a6a-7640142a114f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.469381 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cxw2\" (UniqueName: \"kubernetes.io/projected/8d2720b8-138e-4bca-8a6a-7640142a114f-kube-api-access-9cxw2\") on node \"crc\" DevicePath \"\"" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.800132 4797 generic.go:334] "Generic (PLEG): container finished" podID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerID="f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181" exitCode=0 Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.800193 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k529c" event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerDied","Data":"f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181"} Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.800233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k529c" event={"ID":"8d2720b8-138e-4bca-8a6a-7640142a114f","Type":"ContainerDied","Data":"ebf886e1f9bb59413f0615c47cd807b9998605437ca613a2086a38e5869d37f2"} Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.800255 4797 scope.go:117] "RemoveContainer" containerID="f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.800357 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k529c" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.839704 4797 scope.go:117] "RemoveContainer" containerID="a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.846142 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k529c"] Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.856053 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k529c"] Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.869312 4797 scope.go:117] "RemoveContainer" containerID="75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.916474 4797 scope.go:117] "RemoveContainer" containerID="f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181" Oct 13 15:07:50 crc kubenswrapper[4797]: E1013 15:07:50.916937 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181\": container with ID starting with f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181 not found: ID does not exist" containerID="f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.916987 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181"} err="failed to get container status \"f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181\": rpc error: code = NotFound desc = could not find container \"f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181\": container with ID starting with f042bd38a90963ad31872c6e8dabff6d086aff9f5327173169dfd8abbac91181 not 
found: ID does not exist" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.917023 4797 scope.go:117] "RemoveContainer" containerID="a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80" Oct 13 15:07:50 crc kubenswrapper[4797]: E1013 15:07:50.917486 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80\": container with ID starting with a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80 not found: ID does not exist" containerID="a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.917524 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80"} err="failed to get container status \"a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80\": rpc error: code = NotFound desc = could not find container \"a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80\": container with ID starting with a52cf765bf290520fb31f435b2e8992ec035dfc8801aa2bf63344da991a3dc80 not found: ID does not exist" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.917548 4797 scope.go:117] "RemoveContainer" containerID="75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde" Oct 13 15:07:50 crc kubenswrapper[4797]: E1013 15:07:50.917930 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde\": container with ID starting with 75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde not found: ID does not exist" containerID="75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde" Oct 13 15:07:50 crc kubenswrapper[4797]: I1013 15:07:50.917951 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde"} err="failed to get container status \"75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde\": rpc error: code = NotFound desc = could not find container \"75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde\": container with ID starting with 75715d386c0cc8d0b9b4085ed43a85f1a2d1b605eea4fa7d7aea80c4422b3bde not found: ID does not exist" Oct 13 15:07:51 crc kubenswrapper[4797]: I1013 15:07:51.248215 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" path="/var/lib/kubelet/pods/8d2720b8-138e-4bca-8a6a-7640142a114f/volumes" Oct 13 15:08:04 crc kubenswrapper[4797]: I1013 15:08:04.237657 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:08:04 crc kubenswrapper[4797]: E1013 15:08:04.239001 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:08:15 crc kubenswrapper[4797]: I1013 15:08:15.236895 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:08:15 crc kubenswrapper[4797]: E1013 15:08:15.237798 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:08:26 crc kubenswrapper[4797]: I1013 15:08:26.236956 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:08:26 crc kubenswrapper[4797]: E1013 15:08:26.238108 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:08:39 crc kubenswrapper[4797]: I1013 15:08:39.236468 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:08:39 crc kubenswrapper[4797]: E1013 15:08:39.237719 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.591464 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9knbm"] Oct 13 15:08:40 crc kubenswrapper[4797]: E1013 15:08:40.592220 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="extract-utilities" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 
15:08:40.592233 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="extract-utilities" Oct 13 15:08:40 crc kubenswrapper[4797]: E1013 15:08:40.592266 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="extract-content" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.592272 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="extract-content" Oct 13 15:08:40 crc kubenswrapper[4797]: E1013 15:08:40.592288 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="registry-server" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.592294 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="registry-server" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.592482 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d2720b8-138e-4bca-8a6a-7640142a114f" containerName="registry-server" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.594462 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.606828 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9knbm"] Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.652844 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz2bk\" (UniqueName: \"kubernetes.io/projected/49c92455-4464-4407-85f6-bab35da1c9f4-kube-api-access-kz2bk\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.652950 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-utilities\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.652989 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-catalog-content\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.756512 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz2bk\" (UniqueName: \"kubernetes.io/projected/49c92455-4464-4407-85f6-bab35da1c9f4-kube-api-access-kz2bk\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.756630 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-utilities\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.756674 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-catalog-content\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.757473 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-catalog-content\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.757617 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-utilities\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.789684 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz2bk\" (UniqueName: \"kubernetes.io/projected/49c92455-4464-4407-85f6-bab35da1c9f4-kube-api-access-kz2bk\") pod \"redhat-marketplace-9knbm\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:40 crc kubenswrapper[4797]: I1013 15:08:40.914944 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:41 crc kubenswrapper[4797]: I1013 15:08:41.375869 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9knbm"] Oct 13 15:08:42 crc kubenswrapper[4797]: I1013 15:08:42.353968 4797 generic.go:334] "Generic (PLEG): container finished" podID="49c92455-4464-4407-85f6-bab35da1c9f4" containerID="ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148" exitCode=0 Oct 13 15:08:42 crc kubenswrapper[4797]: I1013 15:08:42.354070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9knbm" event={"ID":"49c92455-4464-4407-85f6-bab35da1c9f4","Type":"ContainerDied","Data":"ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148"} Oct 13 15:08:42 crc kubenswrapper[4797]: I1013 15:08:42.354611 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9knbm" event={"ID":"49c92455-4464-4407-85f6-bab35da1c9f4","Type":"ContainerStarted","Data":"5431737ff2d5ead7086eec61043c682d1baea505963087129199b3cf1b8c439c"} Oct 13 15:08:43 crc kubenswrapper[4797]: I1013 15:08:43.365930 4797 generic.go:334] "Generic (PLEG): container finished" podID="49c92455-4464-4407-85f6-bab35da1c9f4" containerID="8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a" exitCode=0 Oct 13 15:08:43 crc kubenswrapper[4797]: I1013 15:08:43.366007 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9knbm" event={"ID":"49c92455-4464-4407-85f6-bab35da1c9f4","Type":"ContainerDied","Data":"8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a"} Oct 13 15:08:44 crc kubenswrapper[4797]: I1013 15:08:44.377921 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9knbm" 
event={"ID":"49c92455-4464-4407-85f6-bab35da1c9f4","Type":"ContainerStarted","Data":"438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b"} Oct 13 15:08:44 crc kubenswrapper[4797]: I1013 15:08:44.408936 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9knbm" podStartSLOduration=2.782714008 podStartE2EDuration="4.408913054s" podCreationTimestamp="2025-10-13 15:08:40 +0000 UTC" firstStartedPulling="2025-10-13 15:08:42.35695997 +0000 UTC m=+7299.890510226" lastFinishedPulling="2025-10-13 15:08:43.983158996 +0000 UTC m=+7301.516709272" observedRunningTime="2025-10-13 15:08:44.397317708 +0000 UTC m=+7301.930867984" watchObservedRunningTime="2025-10-13 15:08:44.408913054 +0000 UTC m=+7301.942463310" Oct 13 15:08:50 crc kubenswrapper[4797]: I1013 15:08:50.915475 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:50 crc kubenswrapper[4797]: I1013 15:08:50.917753 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:50 crc kubenswrapper[4797]: I1013 15:08:50.991756 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:51 crc kubenswrapper[4797]: I1013 15:08:51.526004 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:51 crc kubenswrapper[4797]: I1013 15:08:51.588235 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9knbm"] Oct 13 15:08:52 crc kubenswrapper[4797]: I1013 15:08:52.236833 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:08:52 crc kubenswrapper[4797]: E1013 15:08:52.237394 4797 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:08:53 crc kubenswrapper[4797]: I1013 15:08:53.486020 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9knbm" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="registry-server" containerID="cri-o://438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b" gracePeriod=2 Oct 13 15:08:53 crc kubenswrapper[4797]: I1013 15:08:53.971006 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.060990 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-catalog-content\") pod \"49c92455-4464-4407-85f6-bab35da1c9f4\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.061163 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz2bk\" (UniqueName: \"kubernetes.io/projected/49c92455-4464-4407-85f6-bab35da1c9f4-kube-api-access-kz2bk\") pod \"49c92455-4464-4407-85f6-bab35da1c9f4\" (UID: \"49c92455-4464-4407-85f6-bab35da1c9f4\") " Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.061359 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-utilities\") pod \"49c92455-4464-4407-85f6-bab35da1c9f4\" (UID: 
\"49c92455-4464-4407-85f6-bab35da1c9f4\") " Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.062255 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-utilities" (OuterVolumeSpecName: "utilities") pod "49c92455-4464-4407-85f6-bab35da1c9f4" (UID: "49c92455-4464-4407-85f6-bab35da1c9f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.068779 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c92455-4464-4407-85f6-bab35da1c9f4-kube-api-access-kz2bk" (OuterVolumeSpecName: "kube-api-access-kz2bk") pod "49c92455-4464-4407-85f6-bab35da1c9f4" (UID: "49c92455-4464-4407-85f6-bab35da1c9f4"). InnerVolumeSpecName "kube-api-access-kz2bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.074059 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49c92455-4464-4407-85f6-bab35da1c9f4" (UID: "49c92455-4464-4407-85f6-bab35da1c9f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.165321 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz2bk\" (UniqueName: \"kubernetes.io/projected/49c92455-4464-4407-85f6-bab35da1c9f4-kube-api-access-kz2bk\") on node \"crc\" DevicePath \"\"" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.165356 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.165369 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c92455-4464-4407-85f6-bab35da1c9f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.508390 4797 generic.go:334] "Generic (PLEG): container finished" podID="49c92455-4464-4407-85f6-bab35da1c9f4" containerID="438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b" exitCode=0 Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.508530 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9knbm" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.508491 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9knbm" event={"ID":"49c92455-4464-4407-85f6-bab35da1c9f4","Type":"ContainerDied","Data":"438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b"} Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.508642 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9knbm" event={"ID":"49c92455-4464-4407-85f6-bab35da1c9f4","Type":"ContainerDied","Data":"5431737ff2d5ead7086eec61043c682d1baea505963087129199b3cf1b8c439c"} Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.508689 4797 scope.go:117] "RemoveContainer" containerID="438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.547262 4797 scope.go:117] "RemoveContainer" containerID="8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.581474 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9knbm"] Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.583635 4797 scope.go:117] "RemoveContainer" containerID="ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.595732 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9knbm"] Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.651737 4797 scope.go:117] "RemoveContainer" containerID="438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b" Oct 13 15:08:54 crc kubenswrapper[4797]: E1013 15:08:54.652153 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b\": container with ID starting with 438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b not found: ID does not exist" containerID="438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.652201 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b"} err="failed to get container status \"438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b\": rpc error: code = NotFound desc = could not find container \"438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b\": container with ID starting with 438c28a4cb454b5a8250638da1c1c1fc2e2210eb4318e274b29006d97e6a328b not found: ID does not exist" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.652228 4797 scope.go:117] "RemoveContainer" containerID="8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a" Oct 13 15:08:54 crc kubenswrapper[4797]: E1013 15:08:54.652523 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a\": container with ID starting with 8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a not found: ID does not exist" containerID="8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.652564 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a"} err="failed to get container status \"8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a\": rpc error: code = NotFound desc = could not find container \"8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a\": container with ID 
starting with 8a8cabd1c6cf80cdb82731fd4061a7ff31a013302831cb470f214033d3f8a10a not found: ID does not exist" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.652590 4797 scope.go:117] "RemoveContainer" containerID="ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148" Oct 13 15:08:54 crc kubenswrapper[4797]: E1013 15:08:54.652793 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148\": container with ID starting with ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148 not found: ID does not exist" containerID="ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148" Oct 13 15:08:54 crc kubenswrapper[4797]: I1013 15:08:54.652818 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148"} err="failed to get container status \"ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148\": rpc error: code = NotFound desc = could not find container \"ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148\": container with ID starting with ca63e2ba268081c49eb52db6ae8640e4c593efaffb9d6763f29598bde3916148 not found: ID does not exist" Oct 13 15:08:55 crc kubenswrapper[4797]: I1013 15:08:55.248443 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" path="/var/lib/kubelet/pods/49c92455-4464-4407-85f6-bab35da1c9f4/volumes" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.086443 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6d7fj"] Oct 13 15:09:00 crc kubenswrapper[4797]: E1013 15:09:00.087774 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="extract-content" Oct 13 15:09:00 crc 
kubenswrapper[4797]: I1013 15:09:00.087794 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="extract-content" Oct 13 15:09:00 crc kubenswrapper[4797]: E1013 15:09:00.087854 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="registry-server" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.087872 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="registry-server" Oct 13 15:09:00 crc kubenswrapper[4797]: E1013 15:09:00.087916 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="extract-utilities" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.087925 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="extract-utilities" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.088438 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c92455-4464-4407-85f6-bab35da1c9f4" containerName="registry-server" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.094783 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.127360 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d7fj"] Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.207873 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8z2v\" (UniqueName: \"kubernetes.io/projected/de3d84c9-c543-4f12-9475-56b144a7726e-kube-api-access-l8z2v\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.207955 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-catalog-content\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.208088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-utilities\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.310709 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-catalog-content\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.310794 4797 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-utilities\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.311085 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8z2v\" (UniqueName: \"kubernetes.io/projected/de3d84c9-c543-4f12-9475-56b144a7726e-kube-api-access-l8z2v\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.311806 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-utilities\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.312412 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-catalog-content\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.337172 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8z2v\" (UniqueName: \"kubernetes.io/projected/de3d84c9-c543-4f12-9475-56b144a7726e-kube-api-access-l8z2v\") pod \"redhat-operators-6d7fj\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.435182 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:00 crc kubenswrapper[4797]: I1013 15:09:00.984310 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d7fj"] Oct 13 15:09:01 crc kubenswrapper[4797]: I1013 15:09:01.633091 4797 generic.go:334] "Generic (PLEG): container finished" podID="de3d84c9-c543-4f12-9475-56b144a7726e" containerID="2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5" exitCode=0 Oct 13 15:09:01 crc kubenswrapper[4797]: I1013 15:09:01.633231 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerDied","Data":"2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5"} Oct 13 15:09:01 crc kubenswrapper[4797]: I1013 15:09:01.633924 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerStarted","Data":"3e410bf8e479eced38811572cf03dbba00bb82b26b269f5402018950b2fad1c5"} Oct 13 15:09:03 crc kubenswrapper[4797]: I1013 15:09:03.655875 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerStarted","Data":"71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5"} Oct 13 15:09:06 crc kubenswrapper[4797]: I1013 15:09:06.699045 4797 generic.go:334] "Generic (PLEG): container finished" podID="de3d84c9-c543-4f12-9475-56b144a7726e" containerID="71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5" exitCode=0 Oct 13 15:09:06 crc kubenswrapper[4797]: I1013 15:09:06.699187 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" 
event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerDied","Data":"71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5"} Oct 13 15:09:07 crc kubenswrapper[4797]: I1013 15:09:07.236516 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:09:07 crc kubenswrapper[4797]: E1013 15:09:07.236741 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:09:07 crc kubenswrapper[4797]: I1013 15:09:07.754451 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerStarted","Data":"4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff"} Oct 13 15:09:07 crc kubenswrapper[4797]: I1013 15:09:07.834779 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6d7fj" podStartSLOduration=2.098894109 podStartE2EDuration="7.834758036s" podCreationTimestamp="2025-10-13 15:09:00 +0000 UTC" firstStartedPulling="2025-10-13 15:09:01.635255408 +0000 UTC m=+7319.168805684" lastFinishedPulling="2025-10-13 15:09:07.371119355 +0000 UTC m=+7324.904669611" observedRunningTime="2025-10-13 15:09:07.814158378 +0000 UTC m=+7325.347708654" watchObservedRunningTime="2025-10-13 15:09:07.834758036 +0000 UTC m=+7325.368308292" Oct 13 15:09:10 crc kubenswrapper[4797]: I1013 15:09:10.436062 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:10 crc 
kubenswrapper[4797]: I1013 15:09:10.436355 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:11 crc kubenswrapper[4797]: I1013 15:09:11.493963 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6d7fj" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="registry-server" probeResult="failure" output=< Oct 13 15:09:11 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:09:11 crc kubenswrapper[4797]: > Oct 13 15:09:18 crc kubenswrapper[4797]: I1013 15:09:18.237006 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:09:18 crc kubenswrapper[4797]: E1013 15:09:18.238538 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:09:21 crc kubenswrapper[4797]: I1013 15:09:21.488310 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6d7fj" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="registry-server" probeResult="failure" output=< Oct 13 15:09:21 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:09:21 crc kubenswrapper[4797]: > Oct 13 15:09:30 crc kubenswrapper[4797]: I1013 15:09:30.495349 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:30 crc kubenswrapper[4797]: I1013 15:09:30.552249 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:31 crc kubenswrapper[4797]: I1013 15:09:31.282695 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6d7fj"] Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.021103 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6d7fj" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="registry-server" containerID="cri-o://4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff" gracePeriod=2 Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.630079 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.821704 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-utilities\") pod \"de3d84c9-c543-4f12-9475-56b144a7726e\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.821963 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8z2v\" (UniqueName: \"kubernetes.io/projected/de3d84c9-c543-4f12-9475-56b144a7726e-kube-api-access-l8z2v\") pod \"de3d84c9-c543-4f12-9475-56b144a7726e\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.822073 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-catalog-content\") pod \"de3d84c9-c543-4f12-9475-56b144a7726e\" (UID: \"de3d84c9-c543-4f12-9475-56b144a7726e\") " Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.822789 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-utilities" (OuterVolumeSpecName: "utilities") pod "de3d84c9-c543-4f12-9475-56b144a7726e" (UID: "de3d84c9-c543-4f12-9475-56b144a7726e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.827255 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3d84c9-c543-4f12-9475-56b144a7726e-kube-api-access-l8z2v" (OuterVolumeSpecName: "kube-api-access-l8z2v") pod "de3d84c9-c543-4f12-9475-56b144a7726e" (UID: "de3d84c9-c543-4f12-9475-56b144a7726e"). InnerVolumeSpecName "kube-api-access-l8z2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.894933 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de3d84c9-c543-4f12-9475-56b144a7726e" (UID: "de3d84c9-c543-4f12-9475-56b144a7726e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.924735 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8z2v\" (UniqueName: \"kubernetes.io/projected/de3d84c9-c543-4f12-9475-56b144a7726e-kube-api-access-l8z2v\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.924765 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:32 crc kubenswrapper[4797]: I1013 15:09:32.924774 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3d84c9-c543-4f12-9475-56b144a7726e-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.037496 4797 generic.go:334] "Generic (PLEG): container finished" podID="de3d84c9-c543-4f12-9475-56b144a7726e" containerID="4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff" exitCode=0 Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.037560 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerDied","Data":"4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff"} Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.037603 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d7fj" event={"ID":"de3d84c9-c543-4f12-9475-56b144a7726e","Type":"ContainerDied","Data":"3e410bf8e479eced38811572cf03dbba00bb82b26b269f5402018950b2fad1c5"} Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.037635 4797 scope.go:117] "RemoveContainer" containerID="4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.039500 
4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d7fj" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.074464 4797 scope.go:117] "RemoveContainer" containerID="71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.095538 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6d7fj"] Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.115354 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6d7fj"] Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.120777 4797 scope.go:117] "RemoveContainer" containerID="2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.161295 4797 scope.go:117] "RemoveContainer" containerID="4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff" Oct 13 15:09:33 crc kubenswrapper[4797]: E1013 15:09:33.162073 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff\": container with ID starting with 4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff not found: ID does not exist" containerID="4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.162154 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff"} err="failed to get container status \"4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff\": rpc error: code = NotFound desc = could not find container \"4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff\": container with ID starting with 
4fb70d71ad54257517b617940368ad8f0e93fc53771fb309e82b2ae23542faff not found: ID does not exist" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.162204 4797 scope.go:117] "RemoveContainer" containerID="71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5" Oct 13 15:09:33 crc kubenswrapper[4797]: E1013 15:09:33.162775 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5\": container with ID starting with 71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5 not found: ID does not exist" containerID="71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.163037 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5"} err="failed to get container status \"71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5\": rpc error: code = NotFound desc = could not find container \"71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5\": container with ID starting with 71d691222f737bd20582cc89481627dd3eaaaec55abe53d1c5562a9977b97bb5 not found: ID does not exist" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.163162 4797 scope.go:117] "RemoveContainer" containerID="2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5" Oct 13 15:09:33 crc kubenswrapper[4797]: E1013 15:09:33.163645 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5\": container with ID starting with 2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5 not found: ID does not exist" containerID="2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5" Oct 13 15:09:33 crc 
kubenswrapper[4797]: I1013 15:09:33.163714 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5"} err="failed to get container status \"2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5\": rpc error: code = NotFound desc = could not find container \"2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5\": container with ID starting with 2adabe5ee1e4525472b4ed5f39cbd38e371510f8171dfd2c5aaf53b4eefa18c5 not found: ID does not exist" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.244837 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:09:33 crc kubenswrapper[4797]: E1013 15:09:33.245575 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:09:33 crc kubenswrapper[4797]: I1013 15:09:33.251039 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" path="/var/lib/kubelet/pods/de3d84c9-c543-4f12-9475-56b144a7726e/volumes" Oct 13 15:09:44 crc kubenswrapper[4797]: I1013 15:09:44.173495 4797 generic.go:334] "Generic (PLEG): container finished" podID="9da6c975-70b2-4296-9070-7608770a0446" containerID="6683657e5a30cf26bb2e075832d19bb24b79e64a11d6896ec09af01d33999795" exitCode=0 Oct 13 15:09:44 crc kubenswrapper[4797]: I1013 15:09:44.173574 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" 
event={"ID":"9da6c975-70b2-4296-9070-7608770a0446","Type":"ContainerDied","Data":"6683657e5a30cf26bb2e075832d19bb24b79e64a11d6896ec09af01d33999795"} Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.669505 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.818940 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-secret-0\") pod \"9da6c975-70b2-4296-9070-7608770a0446\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.819291 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwrtw\" (UniqueName: \"kubernetes.io/projected/9da6c975-70b2-4296-9070-7608770a0446-kube-api-access-cwrtw\") pod \"9da6c975-70b2-4296-9070-7608770a0446\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.819354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ssh-key\") pod \"9da6c975-70b2-4296-9070-7608770a0446\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.819536 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-inventory\") pod \"9da6c975-70b2-4296-9070-7608770a0446\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.819613 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ceph\") pod 
\"9da6c975-70b2-4296-9070-7608770a0446\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.819689 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-combined-ca-bundle\") pod \"9da6c975-70b2-4296-9070-7608770a0446\" (UID: \"9da6c975-70b2-4296-9070-7608770a0446\") " Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.825729 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9da6c975-70b2-4296-9070-7608770a0446" (UID: "9da6c975-70b2-4296-9070-7608770a0446"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.826527 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da6c975-70b2-4296-9070-7608770a0446-kube-api-access-cwrtw" (OuterVolumeSpecName: "kube-api-access-cwrtw") pod "9da6c975-70b2-4296-9070-7608770a0446" (UID: "9da6c975-70b2-4296-9070-7608770a0446"). InnerVolumeSpecName "kube-api-access-cwrtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.827516 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ceph" (OuterVolumeSpecName: "ceph") pod "9da6c975-70b2-4296-9070-7608770a0446" (UID: "9da6c975-70b2-4296-9070-7608770a0446"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.861545 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-inventory" (OuterVolumeSpecName: "inventory") pod "9da6c975-70b2-4296-9070-7608770a0446" (UID: "9da6c975-70b2-4296-9070-7608770a0446"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.862455 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9da6c975-70b2-4296-9070-7608770a0446" (UID: "9da6c975-70b2-4296-9070-7608770a0446"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.867264 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9da6c975-70b2-4296-9070-7608770a0446" (UID: "9da6c975-70b2-4296-9070-7608770a0446"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.922935 4797 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.922979 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwrtw\" (UniqueName: \"kubernetes.io/projected/9da6c975-70b2-4296-9070-7608770a0446-kube-api-access-cwrtw\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.922995 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.923009 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.923021 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:45 crc kubenswrapper[4797]: I1013 15:09:45.923033 4797 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da6c975-70b2-4296-9070-7608770a0446-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.211725 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" event={"ID":"9da6c975-70b2-4296-9070-7608770a0446","Type":"ContainerDied","Data":"67c3d8c00e97eeeff6f6d693c77c4d8a2db6926be03a1e5b7023d8ae50147fbc"} Oct 13 15:09:46 crc 
kubenswrapper[4797]: I1013 15:09:46.211771 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67c3d8c00e97eeeff6f6d693c77c4d8a2db6926be03a1e5b7023d8ae50147fbc" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.211846 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wkdb2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.338904 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-b7rd2"] Oct 13 15:09:46 crc kubenswrapper[4797]: E1013 15:09:46.340070 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="extract-utilities" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.340101 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="extract-utilities" Oct 13 15:09:46 crc kubenswrapper[4797]: E1013 15:09:46.340127 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="extract-content" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.340139 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="extract-content" Oct 13 15:09:46 crc kubenswrapper[4797]: E1013 15:09:46.340377 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da6c975-70b2-4296-9070-7608770a0446" containerName="libvirt-openstack-openstack-cell1" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.340386 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da6c975-70b2-4296-9070-7608770a0446" containerName="libvirt-openstack-openstack-cell1" Oct 13 15:09:46 crc kubenswrapper[4797]: E1013 15:09:46.340413 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" 
containerName="registry-server" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.340422 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="registry-server" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.340988 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da6c975-70b2-4296-9070-7608770a0446" containerName="libvirt-openstack-openstack-cell1" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.341043 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3d84c9-c543-4f12-9475-56b144a7726e" containerName="registry-server" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.342453 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.351726 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.351985 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.352022 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.352093 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.352032 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.353509 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.356237 4797 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.358718 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-b7rd2"] Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.445554 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54b45\" (UniqueName: \"kubernetes.io/projected/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-kube-api-access-54b45\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.445639 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.445687 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.445726 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 
crc kubenswrapper[4797]: I1013 15:09:46.445903 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.446049 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.446087 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.446488 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.446674 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.446957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.447147 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.549603 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.549834 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.549924 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550038 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54b45\" (UniqueName: \"kubernetes.io/projected/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-kube-api-access-54b45\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550097 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550309 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550348 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550423 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550447 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.550679 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 
15:09:46.552705 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.553075 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.554606 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.555362 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.556876 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: 
\"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.557832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.558063 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.558323 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.561549 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.562328 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.572013 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54b45\" (UniqueName: \"kubernetes.io/projected/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-kube-api-access-54b45\") pod \"nova-cell1-openstack-openstack-cell1-b7rd2\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:46 crc kubenswrapper[4797]: I1013 15:09:46.676957 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:09:47 crc kubenswrapper[4797]: I1013 15:09:47.235236 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-b7rd2"] Oct 13 15:09:47 crc kubenswrapper[4797]: I1013 15:09:47.249255 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:09:47 crc kubenswrapper[4797]: E1013 15:09:47.249485 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:09:48 crc kubenswrapper[4797]: I1013 15:09:48.233433 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" 
event={"ID":"26ec5d7d-17f0-4271-ab0b-30af4063a2c1","Type":"ContainerStarted","Data":"b6d68aaf2b8a80b0ec34c9ddf1e25006993a503dda6db754bc7ae36ab60554bc"} Oct 13 15:09:48 crc kubenswrapper[4797]: I1013 15:09:48.233840 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" event={"ID":"26ec5d7d-17f0-4271-ab0b-30af4063a2c1","Type":"ContainerStarted","Data":"66fc16f401825b239567b4c4d2597ec711fed9a1be5c1da3c8adce929d2c1ebc"} Oct 13 15:09:48 crc kubenswrapper[4797]: I1013 15:09:48.276194 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" podStartSLOduration=1.790143902 podStartE2EDuration="2.276168906s" podCreationTimestamp="2025-10-13 15:09:46 +0000 UTC" firstStartedPulling="2025-10-13 15:09:47.24175022 +0000 UTC m=+7364.775300466" lastFinishedPulling="2025-10-13 15:09:47.727775194 +0000 UTC m=+7365.261325470" observedRunningTime="2025-10-13 15:09:48.25726826 +0000 UTC m=+7365.790818536" watchObservedRunningTime="2025-10-13 15:09:48.276168906 +0000 UTC m=+7365.809719182" Oct 13 15:09:59 crc kubenswrapper[4797]: I1013 15:09:59.236351 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:10:00 crc kubenswrapper[4797]: I1013 15:10:00.395768 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"0d8c5076360fc4f912c2146f18658a8ad6b27bc03d71740bdbd957f89fae5603"} Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.326741 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jbk4k"] Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.340123 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.362052 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbk4k"] Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.471704 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-utilities\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.472099 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-catalog-content\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.472301 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzcq\" (UniqueName: \"kubernetes.io/projected/906c973d-0e7f-453a-bac5-f7571e9e6b7b-kube-api-access-glzcq\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.574261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glzcq\" (UniqueName: \"kubernetes.io/projected/906c973d-0e7f-453a-bac5-f7571e9e6b7b-kube-api-access-glzcq\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.574364 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-utilities\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.574489 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-catalog-content\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.574958 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-utilities\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.575025 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-catalog-content\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.598356 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzcq\" (UniqueName: \"kubernetes.io/projected/906c973d-0e7f-453a-bac5-f7571e9e6b7b-kube-api-access-glzcq\") pod \"certified-operators-jbk4k\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:55 crc kubenswrapper[4797]: I1013 15:11:55.673049 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:11:56 crc kubenswrapper[4797]: I1013 15:11:56.177425 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbk4k"] Oct 13 15:11:56 crc kubenswrapper[4797]: W1013 15:11:56.182160 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod906c973d_0e7f_453a_bac5_f7571e9e6b7b.slice/crio-724f42b2a908e0134fdfc6b6c132fdfc9f5446382d42dff0b0d7910c4a61cc04 WatchSource:0}: Error finding container 724f42b2a908e0134fdfc6b6c132fdfc9f5446382d42dff0b0d7910c4a61cc04: Status 404 returned error can't find the container with id 724f42b2a908e0134fdfc6b6c132fdfc9f5446382d42dff0b0d7910c4a61cc04 Oct 13 15:11:56 crc kubenswrapper[4797]: I1013 15:11:56.788390 4797 generic.go:334] "Generic (PLEG): container finished" podID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerID="7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940" exitCode=0 Oct 13 15:11:56 crc kubenswrapper[4797]: I1013 15:11:56.788450 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerDied","Data":"7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940"} Oct 13 15:11:56 crc kubenswrapper[4797]: I1013 15:11:56.788729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerStarted","Data":"724f42b2a908e0134fdfc6b6c132fdfc9f5446382d42dff0b0d7910c4a61cc04"} Oct 13 15:11:56 crc kubenswrapper[4797]: I1013 15:11:56.790891 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:11:58 crc kubenswrapper[4797]: I1013 15:11:58.814169 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerStarted","Data":"c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f"} Oct 13 15:11:59 crc kubenswrapper[4797]: E1013 15:11:59.103297 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod906c973d_0e7f_453a_bac5_f7571e9e6b7b.slice/crio-conmon-c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f.scope\": RecentStats: unable to find data in memory cache]" Oct 13 15:11:59 crc kubenswrapper[4797]: I1013 15:11:59.832951 4797 generic.go:334] "Generic (PLEG): container finished" podID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerID="c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f" exitCode=0 Oct 13 15:11:59 crc kubenswrapper[4797]: I1013 15:11:59.833012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerDied","Data":"c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f"} Oct 13 15:12:00 crc kubenswrapper[4797]: I1013 15:12:00.847303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerStarted","Data":"6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9"} Oct 13 15:12:00 crc kubenswrapper[4797]: I1013 15:12:00.874696 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jbk4k" podStartSLOduration=2.347397068 podStartE2EDuration="5.874665448s" podCreationTimestamp="2025-10-13 15:11:55 +0000 UTC" firstStartedPulling="2025-10-13 15:11:56.7906653 +0000 UTC m=+7494.324215546" lastFinishedPulling="2025-10-13 15:12:00.31793366 +0000 UTC m=+7497.851483926" 
observedRunningTime="2025-10-13 15:12:00.868865875 +0000 UTC m=+7498.402416171" watchObservedRunningTime="2025-10-13 15:12:00.874665448 +0000 UTC m=+7498.408215714" Oct 13 15:12:05 crc kubenswrapper[4797]: I1013 15:12:05.673906 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:12:05 crc kubenswrapper[4797]: I1013 15:12:05.674486 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:12:05 crc kubenswrapper[4797]: I1013 15:12:05.733654 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:12:05 crc kubenswrapper[4797]: I1013 15:12:05.977731 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:12:06 crc kubenswrapper[4797]: I1013 15:12:06.032440 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbk4k"] Oct 13 15:12:07 crc kubenswrapper[4797]: I1013 15:12:07.959935 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jbk4k" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="registry-server" containerID="cri-o://6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9" gracePeriod=2 Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.485921 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.570776 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-catalog-content\") pod \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.570924 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glzcq\" (UniqueName: \"kubernetes.io/projected/906c973d-0e7f-453a-bac5-f7571e9e6b7b-kube-api-access-glzcq\") pod \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.571094 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-utilities\") pod \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\" (UID: \"906c973d-0e7f-453a-bac5-f7571e9e6b7b\") " Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.572525 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-utilities" (OuterVolumeSpecName: "utilities") pod "906c973d-0e7f-453a-bac5-f7571e9e6b7b" (UID: "906c973d-0e7f-453a-bac5-f7571e9e6b7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.579108 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906c973d-0e7f-453a-bac5-f7571e9e6b7b-kube-api-access-glzcq" (OuterVolumeSpecName: "kube-api-access-glzcq") pod "906c973d-0e7f-453a-bac5-f7571e9e6b7b" (UID: "906c973d-0e7f-453a-bac5-f7571e9e6b7b"). InnerVolumeSpecName "kube-api-access-glzcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.614515 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "906c973d-0e7f-453a-bac5-f7571e9e6b7b" (UID: "906c973d-0e7f-453a-bac5-f7571e9e6b7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.673592 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.673631 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906c973d-0e7f-453a-bac5-f7571e9e6b7b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.673645 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glzcq\" (UniqueName: \"kubernetes.io/projected/906c973d-0e7f-453a-bac5-f7571e9e6b7b-kube-api-access-glzcq\") on node \"crc\" DevicePath \"\"" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.977861 4797 generic.go:334] "Generic (PLEG): container finished" podID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerID="6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9" exitCode=0 Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.977937 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerDied","Data":"6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9"} Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.977977 4797 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbk4k" Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.978016 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbk4k" event={"ID":"906c973d-0e7f-453a-bac5-f7571e9e6b7b","Type":"ContainerDied","Data":"724f42b2a908e0134fdfc6b6c132fdfc9f5446382d42dff0b0d7910c4a61cc04"} Oct 13 15:12:08 crc kubenswrapper[4797]: I1013 15:12:08.978062 4797 scope.go:117] "RemoveContainer" containerID="6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.027249 4797 scope.go:117] "RemoveContainer" containerID="c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.039892 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbk4k"] Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.049559 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jbk4k"] Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.069471 4797 scope.go:117] "RemoveContainer" containerID="7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.126535 4797 scope.go:117] "RemoveContainer" containerID="6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9" Oct 13 15:12:09 crc kubenswrapper[4797]: E1013 15:12:09.126928 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9\": container with ID starting with 6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9 not found: ID does not exist" containerID="6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.126973 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9"} err="failed to get container status \"6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9\": rpc error: code = NotFound desc = could not find container \"6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9\": container with ID starting with 6c3689d1a9213e9c87831c33603ec59c522ade35a23e462f3c3e1fd89cb0a7f9 not found: ID does not exist" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.127002 4797 scope.go:117] "RemoveContainer" containerID="c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f" Oct 13 15:12:09 crc kubenswrapper[4797]: E1013 15:12:09.127424 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f\": container with ID starting with c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f not found: ID does not exist" containerID="c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.127472 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f"} err="failed to get container status \"c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f\": rpc error: code = NotFound desc = could not find container \"c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f\": container with ID starting with c887eda719be8242a89b710c9b0ad16bbbdf53942dd5a3789a0c99d24cdb9a1f not found: ID does not exist" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.127488 4797 scope.go:117] "RemoveContainer" containerID="7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940" Oct 13 15:12:09 crc kubenswrapper[4797]: E1013 
15:12:09.127711 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940\": container with ID starting with 7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940 not found: ID does not exist" containerID="7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.127732 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940"} err="failed to get container status \"7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940\": rpc error: code = NotFound desc = could not find container \"7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940\": container with ID starting with 7afad25729522755696e57000988f54314924c8f077a7f9e70b6a90d89ae2940 not found: ID does not exist" Oct 13 15:12:09 crc kubenswrapper[4797]: I1013 15:12:09.250683 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" path="/var/lib/kubelet/pods/906c973d-0e7f-453a-bac5-f7571e9e6b7b/volumes" Oct 13 15:12:18 crc kubenswrapper[4797]: I1013 15:12:18.121013 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:12:18 crc kubenswrapper[4797]: I1013 15:12:18.121901 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 13 15:12:48 crc kubenswrapper[4797]: I1013 15:12:48.120853 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:12:48 crc kubenswrapper[4797]: I1013 15:12:48.121594 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.120485 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.121013 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.121063 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.121641 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d8c5076360fc4f912c2146f18658a8ad6b27bc03d71740bdbd957f89fae5603"} 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.121702 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://0d8c5076360fc4f912c2146f18658a8ad6b27bc03d71740bdbd957f89fae5603" gracePeriod=600 Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.853282 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="0d8c5076360fc4f912c2146f18658a8ad6b27bc03d71740bdbd957f89fae5603" exitCode=0 Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.853362 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"0d8c5076360fc4f912c2146f18658a8ad6b27bc03d71740bdbd957f89fae5603"} Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.854004 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b"} Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.854039 4797 scope.go:117] "RemoveContainer" containerID="3d7a7089f34adab43f593c787419674f839b78a7826391814ca89f2c6d53810d" Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.866071 4797 generic.go:334] "Generic (PLEG): container finished" podID="26ec5d7d-17f0-4271-ab0b-30af4063a2c1" containerID="b6d68aaf2b8a80b0ec34c9ddf1e25006993a503dda6db754bc7ae36ab60554bc" exitCode=0 Oct 13 15:13:18 crc kubenswrapper[4797]: I1013 15:13:18.866116 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" event={"ID":"26ec5d7d-17f0-4271-ab0b-30af4063a2c1","Type":"ContainerDied","Data":"b6d68aaf2b8a80b0ec34c9ddf1e25006993a503dda6db754bc7ae36ab60554bc"} Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.475879 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551509 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-combined-ca-bundle\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551580 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-1\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551606 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54b45\" (UniqueName: \"kubernetes.io/projected/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-kube-api-access-54b45\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551652 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-0\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc 
kubenswrapper[4797]: I1013 15:13:20.551677 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-0\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551727 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-1\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551789 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-1\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551883 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ceph\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551911 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ssh-key\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551966 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-0\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.551995 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-inventory\") pod \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\" (UID: \"26ec5d7d-17f0-4271-ab0b-30af4063a2c1\") " Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.558261 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ceph" (OuterVolumeSpecName: "ceph") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.560308 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-kube-api-access-54b45" (OuterVolumeSpecName: "kube-api-access-54b45") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "kube-api-access-54b45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.568317 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.581268 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.582154 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.584020 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.587103 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.587522 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.591022 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.592925 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-inventory" (OuterVolumeSpecName: "inventory") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.594556 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "26ec5d7d-17f0-4271-ab0b-30af4063a2c1" (UID: "26ec5d7d-17f0-4271-ab0b-30af4063a2c1"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654201 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654234 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654244 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654253 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654263 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654272 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654281 4797 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc 
kubenswrapper[4797]: I1013 15:13:20.654291 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54b45\" (UniqueName: \"kubernetes.io/projected/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-kube-api-access-54b45\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654299 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654308 4797 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.654317 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/26ec5d7d-17f0-4271-ab0b-30af4063a2c1-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.912694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" event={"ID":"26ec5d7d-17f0-4271-ab0b-30af4063a2c1","Type":"ContainerDied","Data":"66fc16f401825b239567b4c4d2597ec711fed9a1be5c1da3c8adce929d2c1ebc"} Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.912750 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fc16f401825b239567b4c4d2597ec711fed9a1be5c1da3c8adce929d2c1ebc" Oct 13 15:13:20 crc kubenswrapper[4797]: I1013 15:13:20.912867 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-b7rd2" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.059829 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8jxlh"] Oct 13 15:13:21 crc kubenswrapper[4797]: E1013 15:13:21.060677 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ec5d7d-17f0-4271-ab0b-30af4063a2c1" containerName="nova-cell1-openstack-openstack-cell1" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.060774 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ec5d7d-17f0-4271-ab0b-30af4063a2c1" containerName="nova-cell1-openstack-openstack-cell1" Oct 13 15:13:21 crc kubenswrapper[4797]: E1013 15:13:21.060888 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="extract-utilities" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.060974 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="extract-utilities" Oct 13 15:13:21 crc kubenswrapper[4797]: E1013 15:13:21.061065 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="registry-server" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.061147 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="registry-server" Oct 13 15:13:21 crc kubenswrapper[4797]: E1013 15:13:21.061266 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="extract-content" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.061348 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="extract-content" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.061656 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="26ec5d7d-17f0-4271-ab0b-30af4063a2c1" containerName="nova-cell1-openstack-openstack-cell1" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.061750 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="906c973d-0e7f-453a-bac5-f7571e9e6b7b" containerName="registry-server" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.062825 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.065452 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.065668 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.065989 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.066145 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.069692 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.071782 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8jxlh"] Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.163292 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " 
pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.163699 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.163731 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.163761 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceph\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.164089 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.164131 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnh2\" (UniqueName: \"kubernetes.io/projected/af167228-e8f8-4bad-82cb-e1d853a9b317-kube-api-access-rjnh2\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.164281 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.164500 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-inventory\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.266444 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.267592 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnh2\" (UniqueName: \"kubernetes.io/projected/af167228-e8f8-4bad-82cb-e1d853a9b317-kube-api-access-rjnh2\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: 
\"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.268010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.268208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-inventory\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.268379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.268532 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.268647 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.268742 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceph\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.272258 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.273216 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceph\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.273378 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.283059 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-inventory\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.283312 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.283412 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.284931 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.291551 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnh2\" (UniqueName: \"kubernetes.io/projected/af167228-e8f8-4bad-82cb-e1d853a9b317-kube-api-access-rjnh2\") pod \"telemetry-openstack-openstack-cell1-8jxlh\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 
15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.388530 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:13:21 crc kubenswrapper[4797]: I1013 15:13:21.994590 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8jxlh"] Oct 13 15:13:22 crc kubenswrapper[4797]: W1013 15:13:22.014952 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf167228_e8f8_4bad_82cb_e1d853a9b317.slice/crio-1816e0ddaab4ee336c3f89edb3e73b0c40bc3246892160af0f93e17af55f17d3 WatchSource:0}: Error finding container 1816e0ddaab4ee336c3f89edb3e73b0c40bc3246892160af0f93e17af55f17d3: Status 404 returned error can't find the container with id 1816e0ddaab4ee336c3f89edb3e73b0c40bc3246892160af0f93e17af55f17d3 Oct 13 15:13:22 crc kubenswrapper[4797]: I1013 15:13:22.945892 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" event={"ID":"af167228-e8f8-4bad-82cb-e1d853a9b317","Type":"ContainerStarted","Data":"7b9dfee15ee9fe36e806f0387340de2bee9591e15ed13f7d8215666ed400c624"} Oct 13 15:13:22 crc kubenswrapper[4797]: I1013 15:13:22.946438 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" event={"ID":"af167228-e8f8-4bad-82cb-e1d853a9b317","Type":"ContainerStarted","Data":"1816e0ddaab4ee336c3f89edb3e73b0c40bc3246892160af0f93e17af55f17d3"} Oct 13 15:13:22 crc kubenswrapper[4797]: I1013 15:13:22.968308 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" podStartSLOduration=1.561260623 podStartE2EDuration="1.968276359s" podCreationTimestamp="2025-10-13 15:13:21 +0000 UTC" firstStartedPulling="2025-10-13 15:13:22.017318831 +0000 UTC m=+7579.550869087" lastFinishedPulling="2025-10-13 
15:13:22.424334567 +0000 UTC m=+7579.957884823" observedRunningTime="2025-10-13 15:13:22.966855734 +0000 UTC m=+7580.500406010" watchObservedRunningTime="2025-10-13 15:13:22.968276359 +0000 UTC m=+7580.501826615" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.158194 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst"] Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.160266 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.163549 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.168272 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.177228 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst"] Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.254955 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m2f\" (UniqueName: \"kubernetes.io/projected/54ac0fc6-ca84-4bfd-a951-910ec25014dd-kube-api-access-l4m2f\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.255009 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54ac0fc6-ca84-4bfd-a951-910ec25014dd-config-volume\") pod \"collect-profiles-29339475-x9hst\" (UID: 
\"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.255583 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54ac0fc6-ca84-4bfd-a951-910ec25014dd-secret-volume\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.357933 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54ac0fc6-ca84-4bfd-a951-910ec25014dd-secret-volume\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.358060 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m2f\" (UniqueName: \"kubernetes.io/projected/54ac0fc6-ca84-4bfd-a951-910ec25014dd-kube-api-access-l4m2f\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.358092 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54ac0fc6-ca84-4bfd-a951-910ec25014dd-config-volume\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.359509 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/54ac0fc6-ca84-4bfd-a951-910ec25014dd-config-volume\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.365539 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54ac0fc6-ca84-4bfd-a951-910ec25014dd-secret-volume\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.399418 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m2f\" (UniqueName: \"kubernetes.io/projected/54ac0fc6-ca84-4bfd-a951-910ec25014dd-kube-api-access-l4m2f\") pod \"collect-profiles-29339475-x9hst\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.488610 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:00 crc kubenswrapper[4797]: I1013 15:15:00.963759 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst"] Oct 13 15:15:01 crc kubenswrapper[4797]: I1013 15:15:01.016892 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" event={"ID":"54ac0fc6-ca84-4bfd-a951-910ec25014dd","Type":"ContainerStarted","Data":"3dbdb2053d870cf8d779f80bb99db7cab051fa87d573f1926286fe342ab1b9c1"} Oct 13 15:15:02 crc kubenswrapper[4797]: I1013 15:15:02.028609 4797 generic.go:334] "Generic (PLEG): container finished" podID="54ac0fc6-ca84-4bfd-a951-910ec25014dd" containerID="b3bf19abd98838c8b3660aef7ef0a4d46bf279b21ed9ce1a237fd5c58c1f2ed1" exitCode=0 Oct 13 15:15:02 crc kubenswrapper[4797]: I1013 15:15:02.028658 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" event={"ID":"54ac0fc6-ca84-4bfd-a951-910ec25014dd","Type":"ContainerDied","Data":"b3bf19abd98838c8b3660aef7ef0a4d46bf279b21ed9ce1a237fd5c58c1f2ed1"} Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.410906 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.528091 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4m2f\" (UniqueName: \"kubernetes.io/projected/54ac0fc6-ca84-4bfd-a951-910ec25014dd-kube-api-access-l4m2f\") pod \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.528264 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54ac0fc6-ca84-4bfd-a951-910ec25014dd-secret-volume\") pod \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.528298 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54ac0fc6-ca84-4bfd-a951-910ec25014dd-config-volume\") pod \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\" (UID: \"54ac0fc6-ca84-4bfd-a951-910ec25014dd\") " Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.529125 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ac0fc6-ca84-4bfd-a951-910ec25014dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "54ac0fc6-ca84-4bfd-a951-910ec25014dd" (UID: "54ac0fc6-ca84-4bfd-a951-910ec25014dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.535174 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac0fc6-ca84-4bfd-a951-910ec25014dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54ac0fc6-ca84-4bfd-a951-910ec25014dd" (UID: "54ac0fc6-ca84-4bfd-a951-910ec25014dd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.535346 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ac0fc6-ca84-4bfd-a951-910ec25014dd-kube-api-access-l4m2f" (OuterVolumeSpecName: "kube-api-access-l4m2f") pod "54ac0fc6-ca84-4bfd-a951-910ec25014dd" (UID: "54ac0fc6-ca84-4bfd-a951-910ec25014dd"). InnerVolumeSpecName "kube-api-access-l4m2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.630893 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54ac0fc6-ca84-4bfd-a951-910ec25014dd-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.631291 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54ac0fc6-ca84-4bfd-a951-910ec25014dd-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:15:03 crc kubenswrapper[4797]: I1013 15:15:03.631308 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4m2f\" (UniqueName: \"kubernetes.io/projected/54ac0fc6-ca84-4bfd-a951-910ec25014dd-kube-api-access-l4m2f\") on node \"crc\" DevicePath \"\"" Oct 13 15:15:04 crc kubenswrapper[4797]: I1013 15:15:04.052704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" event={"ID":"54ac0fc6-ca84-4bfd-a951-910ec25014dd","Type":"ContainerDied","Data":"3dbdb2053d870cf8d779f80bb99db7cab051fa87d573f1926286fe342ab1b9c1"} Oct 13 15:15:04 crc kubenswrapper[4797]: I1013 15:15:04.052753 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dbdb2053d870cf8d779f80bb99db7cab051fa87d573f1926286fe342ab1b9c1" Oct 13 15:15:04 crc kubenswrapper[4797]: I1013 15:15:04.052953 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst" Oct 13 15:15:04 crc kubenswrapper[4797]: I1013 15:15:04.496171 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v"] Oct 13 15:15:04 crc kubenswrapper[4797]: I1013 15:15:04.507019 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339430-5225v"] Oct 13 15:15:05 crc kubenswrapper[4797]: I1013 15:15:05.247477 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7bd21a4-836f-4c46-a04a-6a5f262004a7" path="/var/lib/kubelet/pods/c7bd21a4-836f-4c46-a04a-6a5f262004a7/volumes" Oct 13 15:15:18 crc kubenswrapper[4797]: I1013 15:15:18.120498 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:15:18 crc kubenswrapper[4797]: I1013 15:15:18.121197 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:15:19 crc kubenswrapper[4797]: I1013 15:15:19.696698 4797 scope.go:117] "RemoveContainer" containerID="eafd5fb13c53708c4076fb9a4f3da118ab9fb0b8543c9acdfe748849a26b6a55" Oct 13 15:15:48 crc kubenswrapper[4797]: I1013 15:15:48.120177 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 13 15:15:48 crc kubenswrapper[4797]: I1013 15:15:48.120874 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.120872 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.121928 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.122025 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.123724 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.123898 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" gracePeriod=600 Oct 13 15:16:18 crc kubenswrapper[4797]: E1013 15:16:18.259795 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.910750 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" exitCode=0 Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.910830 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b"} Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.911230 4797 scope.go:117] "RemoveContainer" containerID="0d8c5076360fc4f912c2146f18658a8ad6b27bc03d71740bdbd957f89fae5603" Oct 13 15:16:18 crc kubenswrapper[4797]: I1013 15:16:18.912002 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:16:18 crc kubenswrapper[4797]: E1013 15:16:18.912358 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:16:29 crc kubenswrapper[4797]: I1013 15:16:29.237913 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:16:29 crc kubenswrapper[4797]: E1013 15:16:29.239515 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:16:44 crc kubenswrapper[4797]: I1013 15:16:44.236263 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:16:44 crc kubenswrapper[4797]: E1013 15:16:44.237112 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:16:59 crc kubenswrapper[4797]: I1013 15:16:59.237471 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:16:59 crc kubenswrapper[4797]: E1013 15:16:59.238261 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:17:14 crc kubenswrapper[4797]: I1013 15:17:14.236374 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:17:14 crc kubenswrapper[4797]: E1013 15:17:14.237456 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:17:25 crc kubenswrapper[4797]: I1013 15:17:25.236592 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:17:25 crc kubenswrapper[4797]: E1013 15:17:25.237670 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:17:38 crc kubenswrapper[4797]: I1013 15:17:38.236845 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:17:38 crc kubenswrapper[4797]: E1013 15:17:38.238091 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:17:38 crc kubenswrapper[4797]: I1013 15:17:38.840014 4797 generic.go:334] "Generic (PLEG): container finished" podID="af167228-e8f8-4bad-82cb-e1d853a9b317" containerID="7b9dfee15ee9fe36e806f0387340de2bee9591e15ed13f7d8215666ed400c624" exitCode=0 Oct 13 15:17:38 crc kubenswrapper[4797]: I1013 15:17:38.840112 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" event={"ID":"af167228-e8f8-4bad-82cb-e1d853a9b317","Type":"ContainerDied","Data":"7b9dfee15ee9fe36e806f0387340de2bee9591e15ed13f7d8215666ed400c624"} Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.339223 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.515396 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-1\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.515494 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-2\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.515851 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-0\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.516104 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceph\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.516173 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-inventory\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.516297 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-telemetry-combined-ca-bundle\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.516337 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnh2\" (UniqueName: \"kubernetes.io/projected/af167228-e8f8-4bad-82cb-e1d853a9b317-kube-api-access-rjnh2\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.516402 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ssh-key\") pod \"af167228-e8f8-4bad-82cb-e1d853a9b317\" (UID: \"af167228-e8f8-4bad-82cb-e1d853a9b317\") " 
Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.521798 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.522683 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af167228-e8f8-4bad-82cb-e1d853a9b317-kube-api-access-rjnh2" (OuterVolumeSpecName: "kube-api-access-rjnh2") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "kube-api-access-rjnh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.529071 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceph" (OuterVolumeSpecName: "ceph") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.555026 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.573307 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.574006 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.580797 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-inventory" (OuterVolumeSpecName: "inventory") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.581731 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "af167228-e8f8-4bad-82cb-e1d853a9b317" (UID: "af167228-e8f8-4bad-82cb-e1d853a9b317"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619271 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619322 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619365 4797 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619380 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnh2\" (UniqueName: \"kubernetes.io/projected/af167228-e8f8-4bad-82cb-e1d853a9b317-kube-api-access-rjnh2\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619393 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619405 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.619445 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 
crc kubenswrapper[4797]: I1013 15:17:40.619460 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/af167228-e8f8-4bad-82cb-e1d853a9b317-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.871145 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" event={"ID":"af167228-e8f8-4bad-82cb-e1d853a9b317","Type":"ContainerDied","Data":"1816e0ddaab4ee336c3f89edb3e73b0c40bc3246892160af0f93e17af55f17d3"} Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.871515 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1816e0ddaab4ee336c3f89edb3e73b0c40bc3246892160af0f93e17af55f17d3" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.871634 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8jxlh" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.983448 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dkdhk"] Oct 13 15:17:40 crc kubenswrapper[4797]: E1013 15:17:40.983880 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af167228-e8f8-4bad-82cb-e1d853a9b317" containerName="telemetry-openstack-openstack-cell1" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.983895 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="af167228-e8f8-4bad-82cb-e1d853a9b317" containerName="telemetry-openstack-openstack-cell1" Oct 13 15:17:40 crc kubenswrapper[4797]: E1013 15:17:40.983933 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ac0fc6-ca84-4bfd-a951-910ec25014dd" containerName="collect-profiles" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.983940 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="54ac0fc6-ca84-4bfd-a951-910ec25014dd" containerName="collect-profiles" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.984132 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ac0fc6-ca84-4bfd-a951-910ec25014dd" containerName="collect-profiles" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.984157 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="af167228-e8f8-4bad-82cb-e1d853a9b317" containerName="telemetry-openstack-openstack-cell1" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.984853 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.987229 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.987456 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.987633 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.988664 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.988666 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:17:40 crc kubenswrapper[4797]: I1013 15:17:40.997284 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dkdhk"] Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.129960 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.130008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4cq\" (UniqueName: \"kubernetes.io/projected/373fe301-acc1-486b-a109-62e739dec048-kube-api-access-wt4cq\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.130056 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.130146 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.130206 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: 
\"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.130274 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.232296 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.232369 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4cq\" (UniqueName: \"kubernetes.io/projected/373fe301-acc1-486b-a109-62e739dec048-kube-api-access-wt4cq\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.232462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.232528 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.232584 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.232653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.237915 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.237922 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.239692 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.240478 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.253458 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.262862 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4cq\" (UniqueName: \"kubernetes.io/projected/373fe301-acc1-486b-a109-62e739dec048-kube-api-access-wt4cq\") pod \"neutron-sriov-openstack-openstack-cell1-dkdhk\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.306707 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.884028 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dkdhk"] Oct 13 15:17:41 crc kubenswrapper[4797]: I1013 15:17:41.887470 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.847929 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4k7gk"] Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.850545 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.861007 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4k7gk"] Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.899391 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" event={"ID":"373fe301-acc1-486b-a109-62e739dec048","Type":"ContainerStarted","Data":"26fdfb84518082cdc732bb12957c338ac8a3af46a208a16a4333f0fde8ed8f55"} Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.966352 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-catalog-content\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.966456 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-utilities\") pod 
\"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:42 crc kubenswrapper[4797]: I1013 15:17:42.966737 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgs6p\" (UniqueName: \"kubernetes.io/projected/47f522cb-ffd5-423f-a435-1c2a1fc716ee-kube-api-access-xgs6p\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.069724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-catalog-content\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.069780 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-utilities\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.069861 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgs6p\" (UniqueName: \"kubernetes.io/projected/47f522cb-ffd5-423f-a435-1c2a1fc716ee-kube-api-access-xgs6p\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.070510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-catalog-content\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.070976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-utilities\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.096570 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgs6p\" (UniqueName: \"kubernetes.io/projected/47f522cb-ffd5-423f-a435-1c2a1fc716ee-kube-api-access-xgs6p\") pod \"community-operators-4k7gk\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.190319 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.748584 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4k7gk"] Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.911973 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" event={"ID":"373fe301-acc1-486b-a109-62e739dec048","Type":"ContainerStarted","Data":"abac1b360265b3acc2d189962ea734b909fb0d92d2b108e188e79328b2929bf7"} Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.915019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerStarted","Data":"053d6a443a5e1e9143dc78c4c0b1fbfd012c7ec8682440252371b44908c94ddd"} Oct 13 15:17:43 crc kubenswrapper[4797]: I1013 15:17:43.936870 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" podStartSLOduration=3.11454573 podStartE2EDuration="3.936845711s" podCreationTimestamp="2025-10-13 15:17:40 +0000 UTC" firstStartedPulling="2025-10-13 15:17:41.887114089 +0000 UTC m=+7839.420664365" lastFinishedPulling="2025-10-13 15:17:42.70941408 +0000 UTC m=+7840.242964346" observedRunningTime="2025-10-13 15:17:43.932613387 +0000 UTC m=+7841.466163673" watchObservedRunningTime="2025-10-13 15:17:43.936845711 +0000 UTC m=+7841.470395967" Oct 13 15:17:44 crc kubenswrapper[4797]: I1013 15:17:44.928169 4797 generic.go:334] "Generic (PLEG): container finished" podID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerID="3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee" exitCode=0 Oct 13 15:17:44 crc kubenswrapper[4797]: I1013 15:17:44.928256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" 
event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerDied","Data":"3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee"} Oct 13 15:17:45 crc kubenswrapper[4797]: I1013 15:17:45.945033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerStarted","Data":"175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756"} Oct 13 15:17:47 crc kubenswrapper[4797]: I1013 15:17:47.968536 4797 generic.go:334] "Generic (PLEG): container finished" podID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerID="175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756" exitCode=0 Oct 13 15:17:47 crc kubenswrapper[4797]: I1013 15:17:47.968621 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerDied","Data":"175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756"} Oct 13 15:17:48 crc kubenswrapper[4797]: I1013 15:17:48.981288 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerStarted","Data":"38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c"} Oct 13 15:17:49 crc kubenswrapper[4797]: I1013 15:17:49.007376 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4k7gk" podStartSLOduration=3.546905112 podStartE2EDuration="7.007350112s" podCreationTimestamp="2025-10-13 15:17:42 +0000 UTC" firstStartedPulling="2025-10-13 15:17:44.930997236 +0000 UTC m=+7842.464547502" lastFinishedPulling="2025-10-13 15:17:48.391442236 +0000 UTC m=+7845.924992502" observedRunningTime="2025-10-13 15:17:48.997678143 +0000 UTC m=+7846.531228419" watchObservedRunningTime="2025-10-13 15:17:49.007350112 +0000 UTC 
m=+7846.540900368" Oct 13 15:17:53 crc kubenswrapper[4797]: I1013 15:17:53.191395 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:53 crc kubenswrapper[4797]: I1013 15:17:53.193181 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:53 crc kubenswrapper[4797]: I1013 15:17:53.252623 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:17:53 crc kubenswrapper[4797]: E1013 15:17:53.252923 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:17:53 crc kubenswrapper[4797]: I1013 15:17:53.254602 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:54 crc kubenswrapper[4797]: I1013 15:17:54.117403 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:54 crc kubenswrapper[4797]: I1013 15:17:54.181502 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4k7gk"] Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.055027 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4k7gk" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="registry-server" containerID="cri-o://38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c" gracePeriod=2 Oct 
13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.519414 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.712643 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgs6p\" (UniqueName: \"kubernetes.io/projected/47f522cb-ffd5-423f-a435-1c2a1fc716ee-kube-api-access-xgs6p\") pod \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.712733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-catalog-content\") pod \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.712916 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-utilities\") pod \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\" (UID: \"47f522cb-ffd5-423f-a435-1c2a1fc716ee\") " Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.713670 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-utilities" (OuterVolumeSpecName: "utilities") pod "47f522cb-ffd5-423f-a435-1c2a1fc716ee" (UID: "47f522cb-ffd5-423f-a435-1c2a1fc716ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.722533 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f522cb-ffd5-423f-a435-1c2a1fc716ee-kube-api-access-xgs6p" (OuterVolumeSpecName: "kube-api-access-xgs6p") pod "47f522cb-ffd5-423f-a435-1c2a1fc716ee" (UID: "47f522cb-ffd5-423f-a435-1c2a1fc716ee"). InnerVolumeSpecName "kube-api-access-xgs6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.766471 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47f522cb-ffd5-423f-a435-1c2a1fc716ee" (UID: "47f522cb-ffd5-423f-a435-1c2a1fc716ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.815687 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgs6p\" (UniqueName: \"kubernetes.io/projected/47f522cb-ffd5-423f-a435-1c2a1fc716ee-kube-api-access-xgs6p\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.815723 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:56 crc kubenswrapper[4797]: I1013 15:17:56.815732 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47f522cb-ffd5-423f-a435-1c2a1fc716ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.065481 4797 generic.go:334] "Generic (PLEG): container finished" podID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" 
containerID="38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c" exitCode=0 Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.065527 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerDied","Data":"38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c"} Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.065554 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k7gk" event={"ID":"47f522cb-ffd5-423f-a435-1c2a1fc716ee","Type":"ContainerDied","Data":"053d6a443a5e1e9143dc78c4c0b1fbfd012c7ec8682440252371b44908c94ddd"} Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.065570 4797 scope.go:117] "RemoveContainer" containerID="38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.065569 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4k7gk" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.089007 4797 scope.go:117] "RemoveContainer" containerID="175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.119158 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4k7gk"] Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.129377 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4k7gk"] Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.140230 4797 scope.go:117] "RemoveContainer" containerID="3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.173895 4797 scope.go:117] "RemoveContainer" containerID="38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c" Oct 13 15:17:57 crc kubenswrapper[4797]: E1013 15:17:57.174711 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c\": container with ID starting with 38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c not found: ID does not exist" containerID="38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.174777 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c"} err="failed to get container status \"38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c\": rpc error: code = NotFound desc = could not find container \"38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c\": container with ID starting with 38961888e7455239c328679ef64be1e23d045077a63525bd3487cce622d2589c not 
found: ID does not exist" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.174837 4797 scope.go:117] "RemoveContainer" containerID="175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756" Oct 13 15:17:57 crc kubenswrapper[4797]: E1013 15:17:57.175374 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756\": container with ID starting with 175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756 not found: ID does not exist" containerID="175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.175433 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756"} err="failed to get container status \"175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756\": rpc error: code = NotFound desc = could not find container \"175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756\": container with ID starting with 175acd0ad61e34de2c0a4ea72ac0c58eb8f779260931243916fcdb20a3fd5756 not found: ID does not exist" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.175462 4797 scope.go:117] "RemoveContainer" containerID="3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee" Oct 13 15:17:57 crc kubenswrapper[4797]: E1013 15:17:57.175794 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee\": container with ID starting with 3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee not found: ID does not exist" containerID="3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.175854 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee"} err="failed to get container status \"3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee\": rpc error: code = NotFound desc = could not find container \"3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee\": container with ID starting with 3a252cda3ca84aa118ad19adbca0cb0c08e334ba462a4639966aa5f0631f89ee not found: ID does not exist" Oct 13 15:17:57 crc kubenswrapper[4797]: I1013 15:17:57.251569 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" path="/var/lib/kubelet/pods/47f522cb-ffd5-423f-a435-1c2a1fc716ee/volumes" Oct 13 15:18:05 crc kubenswrapper[4797]: I1013 15:18:05.236425 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:18:05 crc kubenswrapper[4797]: E1013 15:18:05.237200 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:18:19 crc kubenswrapper[4797]: I1013 15:18:19.236887 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:18:19 crc kubenswrapper[4797]: E1013 15:18:19.237892 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:18:31 crc kubenswrapper[4797]: I1013 15:18:31.236894 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:18:31 crc kubenswrapper[4797]: E1013 15:18:31.238363 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:18:42 crc kubenswrapper[4797]: I1013 15:18:42.236232 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:18:42 crc kubenswrapper[4797]: E1013 15:18:42.237030 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:18:54 crc kubenswrapper[4797]: I1013 15:18:54.238035 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:18:54 crc kubenswrapper[4797]: E1013 15:18:54.241122 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:19:09 crc kubenswrapper[4797]: I1013 15:19:09.236371 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:19:09 crc kubenswrapper[4797]: E1013 15:19:09.237237 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:19:24 crc kubenswrapper[4797]: I1013 15:19:24.237150 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:19:24 crc kubenswrapper[4797]: E1013 15:19:24.238835 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:19:36 crc kubenswrapper[4797]: I1013 15:19:36.236239 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:19:36 crc kubenswrapper[4797]: E1013 15:19:36.236978 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:19:48 crc kubenswrapper[4797]: I1013 15:19:48.236642 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:19:48 crc kubenswrapper[4797]: E1013 15:19:48.237518 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.559729 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-69lcq"] Oct 13 15:19:49 crc kubenswrapper[4797]: E1013 15:19:49.560545 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="extract-content" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.560561 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="extract-content" Oct 13 15:19:49 crc kubenswrapper[4797]: E1013 15:19:49.560582 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="extract-utilities" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.560591 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="extract-utilities" Oct 13 15:19:49 crc kubenswrapper[4797]: E1013 15:19:49.560615 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="registry-server" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.560624 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="registry-server" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.562320 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f522cb-ffd5-423f-a435-1c2a1fc716ee" containerName="registry-server" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.576711 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.587006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-utilities\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.587077 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-catalog-content\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.587338 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htgqg\" (UniqueName: \"kubernetes.io/projected/6eabd4ed-9132-44b7-b717-f251458f5441-kube-api-access-htgqg\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.606700 4797 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69lcq"] Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.688476 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-utilities\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.688525 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-catalog-content\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.688552 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htgqg\" (UniqueName: \"kubernetes.io/projected/6eabd4ed-9132-44b7-b717-f251458f5441-kube-api-access-htgqg\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.688940 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-utilities\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.688960 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-catalog-content\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " 
pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.712147 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htgqg\" (UniqueName: \"kubernetes.io/projected/6eabd4ed-9132-44b7-b717-f251458f5441-kube-api-access-htgqg\") pod \"redhat-operators-69lcq\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:49 crc kubenswrapper[4797]: I1013 15:19:49.909304 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:50 crc kubenswrapper[4797]: I1013 15:19:50.409183 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69lcq"] Oct 13 15:19:51 crc kubenswrapper[4797]: I1013 15:19:51.370270 4797 generic.go:334] "Generic (PLEG): container finished" podID="6eabd4ed-9132-44b7-b717-f251458f5441" containerID="c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0" exitCode=0 Oct 13 15:19:51 crc kubenswrapper[4797]: I1013 15:19:51.370354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerDied","Data":"c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0"} Oct 13 15:19:51 crc kubenswrapper[4797]: I1013 15:19:51.370995 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerStarted","Data":"17f1b6066fd8f1dbebb6abc4e540936462a083c84aba32f31963b5f3cefb5f39"} Oct 13 15:19:53 crc kubenswrapper[4797]: I1013 15:19:53.399334 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" 
event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerStarted","Data":"6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d"} Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.490103 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wwhqh"] Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.493352 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.532120 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwhqh"] Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.634937 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-utilities\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.635219 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcn6p\" (UniqueName: \"kubernetes.io/projected/db0b28e8-de72-4fa1-a8ae-8b87b743819a-kube-api-access-kcn6p\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.635312 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-catalog-content\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.737654 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-catalog-content\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.737870 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-utilities\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.737908 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcn6p\" (UniqueName: \"kubernetes.io/projected/db0b28e8-de72-4fa1-a8ae-8b87b743819a-kube-api-access-kcn6p\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.738372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-catalog-content\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.738491 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-utilities\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.761630 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kcn6p\" (UniqueName: \"kubernetes.io/projected/db0b28e8-de72-4fa1-a8ae-8b87b743819a-kube-api-access-kcn6p\") pod \"redhat-marketplace-wwhqh\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:56 crc kubenswrapper[4797]: I1013 15:19:56.823348 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:19:57 crc kubenswrapper[4797]: W1013 15:19:57.373479 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0b28e8_de72_4fa1_a8ae_8b87b743819a.slice/crio-81ec0f30c5030cdc7c96fe3c743f90b140e37cf30d8ddc4303c577bd761cde81 WatchSource:0}: Error finding container 81ec0f30c5030cdc7c96fe3c743f90b140e37cf30d8ddc4303c577bd761cde81: Status 404 returned error can't find the container with id 81ec0f30c5030cdc7c96fe3c743f90b140e37cf30d8ddc4303c577bd761cde81 Oct 13 15:19:57 crc kubenswrapper[4797]: I1013 15:19:57.378547 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwhqh"] Oct 13 15:19:57 crc kubenswrapper[4797]: I1013 15:19:57.439357 4797 generic.go:334] "Generic (PLEG): container finished" podID="6eabd4ed-9132-44b7-b717-f251458f5441" containerID="6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d" exitCode=0 Oct 13 15:19:57 crc kubenswrapper[4797]: I1013 15:19:57.439441 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerDied","Data":"6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d"} Oct 13 15:19:57 crc kubenswrapper[4797]: I1013 15:19:57.440953 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwhqh" 
event={"ID":"db0b28e8-de72-4fa1-a8ae-8b87b743819a","Type":"ContainerStarted","Data":"81ec0f30c5030cdc7c96fe3c743f90b140e37cf30d8ddc4303c577bd761cde81"} Oct 13 15:19:58 crc kubenswrapper[4797]: I1013 15:19:58.454307 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerStarted","Data":"7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b"} Oct 13 15:19:58 crc kubenswrapper[4797]: I1013 15:19:58.456403 4797 generic.go:334] "Generic (PLEG): container finished" podID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerID="1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17" exitCode=0 Oct 13 15:19:58 crc kubenswrapper[4797]: I1013 15:19:58.456449 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwhqh" event={"ID":"db0b28e8-de72-4fa1-a8ae-8b87b743819a","Type":"ContainerDied","Data":"1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17"} Oct 13 15:19:58 crc kubenswrapper[4797]: I1013 15:19:58.480821 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-69lcq" podStartSLOduration=2.975368157 podStartE2EDuration="9.480783461s" podCreationTimestamp="2025-10-13 15:19:49 +0000 UTC" firstStartedPulling="2025-10-13 15:19:51.372524064 +0000 UTC m=+7968.906074320" lastFinishedPulling="2025-10-13 15:19:57.877939348 +0000 UTC m=+7975.411489624" observedRunningTime="2025-10-13 15:19:58.472694182 +0000 UTC m=+7976.006244458" watchObservedRunningTime="2025-10-13 15:19:58.480783461 +0000 UTC m=+7976.014333727" Oct 13 15:19:59 crc kubenswrapper[4797]: I1013 15:19:59.909721 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:19:59 crc kubenswrapper[4797]: I1013 15:19:59.910544 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:20:00 crc kubenswrapper[4797]: I1013 15:20:00.481793 4797 generic.go:334] "Generic (PLEG): container finished" podID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerID="4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65" exitCode=0 Oct 13 15:20:00 crc kubenswrapper[4797]: I1013 15:20:00.481855 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwhqh" event={"ID":"db0b28e8-de72-4fa1-a8ae-8b87b743819a","Type":"ContainerDied","Data":"4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65"} Oct 13 15:20:00 crc kubenswrapper[4797]: I1013 15:20:00.971450 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-69lcq" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="registry-server" probeResult="failure" output=< Oct 13 15:20:00 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:20:00 crc kubenswrapper[4797]: > Oct 13 15:20:01 crc kubenswrapper[4797]: I1013 15:20:01.498008 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwhqh" event={"ID":"db0b28e8-de72-4fa1-a8ae-8b87b743819a","Type":"ContainerStarted","Data":"76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444"} Oct 13 15:20:01 crc kubenswrapper[4797]: I1013 15:20:01.527759 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wwhqh" podStartSLOduration=3.09525915 podStartE2EDuration="5.527738303s" podCreationTimestamp="2025-10-13 15:19:56 +0000 UTC" firstStartedPulling="2025-10-13 15:19:58.459531817 +0000 UTC m=+7975.993082083" lastFinishedPulling="2025-10-13 15:20:00.89201097 +0000 UTC m=+7978.425561236" observedRunningTime="2025-10-13 15:20:01.518889705 +0000 UTC m=+7979.052439991" watchObservedRunningTime="2025-10-13 15:20:01.527738303 +0000 UTC 
m=+7979.061288579" Oct 13 15:20:02 crc kubenswrapper[4797]: I1013 15:20:02.236248 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:20:02 crc kubenswrapper[4797]: E1013 15:20:02.236623 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:20:06 crc kubenswrapper[4797]: I1013 15:20:06.824395 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:20:06 crc kubenswrapper[4797]: I1013 15:20:06.825274 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:20:06 crc kubenswrapper[4797]: I1013 15:20:06.926364 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:20:07 crc kubenswrapper[4797]: I1013 15:20:07.601569 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:20:07 crc kubenswrapper[4797]: I1013 15:20:07.656291 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwhqh"] Oct 13 15:20:09 crc kubenswrapper[4797]: I1013 15:20:09.574512 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wwhqh" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="registry-server" containerID="cri-o://76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444" gracePeriod=2 Oct 13 
15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.094542 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.270354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-utilities\") pod \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.270503 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-catalog-content\") pod \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.270676 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcn6p\" (UniqueName: \"kubernetes.io/projected/db0b28e8-de72-4fa1-a8ae-8b87b743819a-kube-api-access-kcn6p\") pod \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\" (UID: \"db0b28e8-de72-4fa1-a8ae-8b87b743819a\") " Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.273242 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-utilities" (OuterVolumeSpecName: "utilities") pod "db0b28e8-de72-4fa1-a8ae-8b87b743819a" (UID: "db0b28e8-de72-4fa1-a8ae-8b87b743819a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.276653 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0b28e8-de72-4fa1-a8ae-8b87b743819a-kube-api-access-kcn6p" (OuterVolumeSpecName: "kube-api-access-kcn6p") pod "db0b28e8-de72-4fa1-a8ae-8b87b743819a" (UID: "db0b28e8-de72-4fa1-a8ae-8b87b743819a"). InnerVolumeSpecName "kube-api-access-kcn6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.294422 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db0b28e8-de72-4fa1-a8ae-8b87b743819a" (UID: "db0b28e8-de72-4fa1-a8ae-8b87b743819a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.374276 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcn6p\" (UniqueName: \"kubernetes.io/projected/db0b28e8-de72-4fa1-a8ae-8b87b743819a-kube-api-access-kcn6p\") on node \"crc\" DevicePath \"\"" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.374304 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.374315 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db0b28e8-de72-4fa1-a8ae-8b87b743819a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.587010 4797 generic.go:334] "Generic (PLEG): container finished" podID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" 
containerID="76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444" exitCode=0 Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.587064 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwhqh" event={"ID":"db0b28e8-de72-4fa1-a8ae-8b87b743819a","Type":"ContainerDied","Data":"76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444"} Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.587096 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwhqh" event={"ID":"db0b28e8-de72-4fa1-a8ae-8b87b743819a","Type":"ContainerDied","Data":"81ec0f30c5030cdc7c96fe3c743f90b140e37cf30d8ddc4303c577bd761cde81"} Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.587117 4797 scope.go:117] "RemoveContainer" containerID="76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.587115 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwhqh" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.617371 4797 scope.go:117] "RemoveContainer" containerID="4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.633558 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwhqh"] Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.645334 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwhqh"] Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.652288 4797 scope.go:117] "RemoveContainer" containerID="1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.685446 4797 scope.go:117] "RemoveContainer" containerID="76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444" Oct 13 15:20:10 crc kubenswrapper[4797]: E1013 15:20:10.686164 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444\": container with ID starting with 76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444 not found: ID does not exist" containerID="76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.686205 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444"} err="failed to get container status \"76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444\": rpc error: code = NotFound desc = could not find container \"76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444\": container with ID starting with 76997d0f2a4e0bb11eae1b765c57c0921a724a489e24c1473389569fa8161444 not found: 
ID does not exist" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.686228 4797 scope.go:117] "RemoveContainer" containerID="4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65" Oct 13 15:20:10 crc kubenswrapper[4797]: E1013 15:20:10.686897 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65\": container with ID starting with 4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65 not found: ID does not exist" containerID="4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.687043 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65"} err="failed to get container status \"4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65\": rpc error: code = NotFound desc = could not find container \"4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65\": container with ID starting with 4338d30a0cc0be9918ded5a175535f5d3e2a2b3ac5802577993800e3faaa8e65 not found: ID does not exist" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.687122 4797 scope.go:117] "RemoveContainer" containerID="1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17" Oct 13 15:20:10 crc kubenswrapper[4797]: E1013 15:20:10.687538 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17\": container with ID starting with 1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17 not found: ID does not exist" containerID="1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.687572 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17"} err="failed to get container status \"1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17\": rpc error: code = NotFound desc = could not find container \"1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17\": container with ID starting with 1c855c9c0d1b3689034eee767d8bda893c946b7792e4b294c20d048c9367bd17 not found: ID does not exist" Oct 13 15:20:10 crc kubenswrapper[4797]: I1013 15:20:10.985728 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-69lcq" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="registry-server" probeResult="failure" output=< Oct 13 15:20:10 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:20:10 crc kubenswrapper[4797]: > Oct 13 15:20:11 crc kubenswrapper[4797]: I1013 15:20:11.250639 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" path="/var/lib/kubelet/pods/db0b28e8-de72-4fa1-a8ae-8b87b743819a/volumes" Oct 13 15:20:17 crc kubenswrapper[4797]: I1013 15:20:17.236576 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:20:17 crc kubenswrapper[4797]: E1013 15:20:17.237397 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:20:19 crc kubenswrapper[4797]: I1013 15:20:19.957456 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:20:20 crc kubenswrapper[4797]: I1013 15:20:20.006676 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:20:20 crc kubenswrapper[4797]: I1013 15:20:20.761926 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69lcq"] Oct 13 15:20:21 crc kubenswrapper[4797]: I1013 15:20:21.701989 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-69lcq" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="registry-server" containerID="cri-o://7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b" gracePeriod=2 Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.174475 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.327233 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-utilities\") pod \"6eabd4ed-9132-44b7-b717-f251458f5441\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.327663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htgqg\" (UniqueName: \"kubernetes.io/projected/6eabd4ed-9132-44b7-b717-f251458f5441-kube-api-access-htgqg\") pod \"6eabd4ed-9132-44b7-b717-f251458f5441\" (UID: \"6eabd4ed-9132-44b7-b717-f251458f5441\") " Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.327792 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-catalog-content\") pod \"6eabd4ed-9132-44b7-b717-f251458f5441\" (UID: 
\"6eabd4ed-9132-44b7-b717-f251458f5441\") " Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.328585 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-utilities" (OuterVolumeSpecName: "utilities") pod "6eabd4ed-9132-44b7-b717-f251458f5441" (UID: "6eabd4ed-9132-44b7-b717-f251458f5441"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.328985 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.333681 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eabd4ed-9132-44b7-b717-f251458f5441-kube-api-access-htgqg" (OuterVolumeSpecName: "kube-api-access-htgqg") pod "6eabd4ed-9132-44b7-b717-f251458f5441" (UID: "6eabd4ed-9132-44b7-b717-f251458f5441"). InnerVolumeSpecName "kube-api-access-htgqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.426676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eabd4ed-9132-44b7-b717-f251458f5441" (UID: "6eabd4ed-9132-44b7-b717-f251458f5441"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.430460 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htgqg\" (UniqueName: \"kubernetes.io/projected/6eabd4ed-9132-44b7-b717-f251458f5441-kube-api-access-htgqg\") on node \"crc\" DevicePath \"\"" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.430504 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eabd4ed-9132-44b7-b717-f251458f5441-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.717878 4797 generic.go:334] "Generic (PLEG): container finished" podID="6eabd4ed-9132-44b7-b717-f251458f5441" containerID="7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b" exitCode=0 Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.717918 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerDied","Data":"7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b"} Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.717941 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-69lcq" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.717954 4797 scope.go:117] "RemoveContainer" containerID="7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.717943 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69lcq" event={"ID":"6eabd4ed-9132-44b7-b717-f251458f5441","Type":"ContainerDied","Data":"17f1b6066fd8f1dbebb6abc4e540936462a083c84aba32f31963b5f3cefb5f39"} Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.741195 4797 scope.go:117] "RemoveContainer" containerID="6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.759934 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69lcq"] Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.767284 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-69lcq"] Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.778033 4797 scope.go:117] "RemoveContainer" containerID="c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.812596 4797 scope.go:117] "RemoveContainer" containerID="7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b" Oct 13 15:20:22 crc kubenswrapper[4797]: E1013 15:20:22.813162 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b\": container with ID starting with 7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b not found: ID does not exist" containerID="7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.813219 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b"} err="failed to get container status \"7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b\": rpc error: code = NotFound desc = could not find container \"7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b\": container with ID starting with 7461f1aa199b4bf32d37a36d45830441082d8e28b7e37054f36d32edc97df94b not found: ID does not exist" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.813291 4797 scope.go:117] "RemoveContainer" containerID="6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d" Oct 13 15:20:22 crc kubenswrapper[4797]: E1013 15:20:22.813758 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d\": container with ID starting with 6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d not found: ID does not exist" containerID="6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.813792 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d"} err="failed to get container status \"6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d\": rpc error: code = NotFound desc = could not find container \"6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d\": container with ID starting with 6beaaafb195bf1ccabf3a2b00305e4d91ebd24c86c474c3c60b67bb44dfef23d not found: ID does not exist" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.813827 4797 scope.go:117] "RemoveContainer" containerID="c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0" Oct 13 15:20:22 crc kubenswrapper[4797]: E1013 
15:20:22.814125 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0\": container with ID starting with c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0 not found: ID does not exist" containerID="c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0" Oct 13 15:20:22 crc kubenswrapper[4797]: I1013 15:20:22.814159 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0"} err="failed to get container status \"c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0\": rpc error: code = NotFound desc = could not find container \"c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0\": container with ID starting with c0bd6976a5d6e9bd7291a9745730a30576d201e8429aa562f85b661a170317a0 not found: ID does not exist" Oct 13 15:20:23 crc kubenswrapper[4797]: I1013 15:20:23.259304 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" path="/var/lib/kubelet/pods/6eabd4ed-9132-44b7-b717-f251458f5441/volumes" Oct 13 15:20:31 crc kubenswrapper[4797]: I1013 15:20:31.236824 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:20:31 crc kubenswrapper[4797]: E1013 15:20:31.237725 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:20:44 crc kubenswrapper[4797]: I1013 15:20:44.235923 
4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:20:44 crc kubenswrapper[4797]: E1013 15:20:44.237138 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:20:55 crc kubenswrapper[4797]: I1013 15:20:55.237120 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:20:55 crc kubenswrapper[4797]: E1013 15:20:55.237976 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:21:08 crc kubenswrapper[4797]: I1013 15:21:08.237376 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:21:08 crc kubenswrapper[4797]: E1013 15:21:08.238294 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:21:19 crc kubenswrapper[4797]: I1013 
15:21:19.237990 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:21:20 crc kubenswrapper[4797]: I1013 15:21:20.333869 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"d5154c01ef831f582bdc74e590c7f68a0327106b2b148c6d1b0c61b36503e1a2"} Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.371429 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwzl9"] Oct 13 15:22:42 crc kubenswrapper[4797]: E1013 15:22:42.372888 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="extract-utilities" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.372915 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="extract-utilities" Oct 13 15:22:42 crc kubenswrapper[4797]: E1013 15:22:42.372939 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="extract-content" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.372951 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="extract-content" Oct 13 15:22:42 crc kubenswrapper[4797]: E1013 15:22:42.372998 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="extract-utilities" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.373011 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="extract-utilities" Oct 13 15:22:42 crc kubenswrapper[4797]: E1013 15:22:42.373042 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="registry-server" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.373053 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="registry-server" Oct 13 15:22:42 crc kubenswrapper[4797]: E1013 15:22:42.373080 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="registry-server" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.373091 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="registry-server" Oct 13 15:22:42 crc kubenswrapper[4797]: E1013 15:22:42.373108 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="extract-content" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.373118 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="extract-content" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.373632 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="db0b28e8-de72-4fa1-a8ae-8b87b743819a" containerName="registry-server" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.373660 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eabd4ed-9132-44b7-b717-f251458f5441" containerName="registry-server" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.376304 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.387896 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwzl9"] Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.467553 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjtlm\" (UniqueName: \"kubernetes.io/projected/d3337b64-24f8-453c-93c7-8fb7be2ea68f-kube-api-access-cjtlm\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.467988 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-catalog-content\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.468214 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-utilities\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.570440 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjtlm\" (UniqueName: \"kubernetes.io/projected/d3337b64-24f8-453c-93c7-8fb7be2ea68f-kube-api-access-cjtlm\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.570836 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-catalog-content\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.571206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-utilities\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.571358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-catalog-content\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.571788 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-utilities\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.595774 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjtlm\" (UniqueName: \"kubernetes.io/projected/d3337b64-24f8-453c-93c7-8fb7be2ea68f-kube-api-access-cjtlm\") pod \"certified-operators-jwzl9\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:42 crc kubenswrapper[4797]: I1013 15:22:42.728252 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:43 crc kubenswrapper[4797]: I1013 15:22:43.214630 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwzl9"] Oct 13 15:22:43 crc kubenswrapper[4797]: I1013 15:22:43.325416 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerStarted","Data":"c066534eb37a863a3e2efdf46dc5c05bb1acf228f954dc54f439ac8d46de6353"} Oct 13 15:22:44 crc kubenswrapper[4797]: I1013 15:22:44.335954 4797 generic.go:334] "Generic (PLEG): container finished" podID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerID="aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b" exitCode=0 Oct 13 15:22:44 crc kubenswrapper[4797]: I1013 15:22:44.336051 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerDied","Data":"aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b"} Oct 13 15:22:44 crc kubenswrapper[4797]: I1013 15:22:44.338100 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:22:46 crc kubenswrapper[4797]: I1013 15:22:46.366232 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerStarted","Data":"0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506"} Oct 13 15:22:47 crc kubenswrapper[4797]: I1013 15:22:47.378463 4797 generic.go:334] "Generic (PLEG): container finished" podID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerID="0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506" exitCode=0 Oct 13 15:22:47 crc kubenswrapper[4797]: I1013 15:22:47.378536 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerDied","Data":"0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506"} Oct 13 15:22:48 crc kubenswrapper[4797]: I1013 15:22:48.395466 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerStarted","Data":"a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7"} Oct 13 15:22:48 crc kubenswrapper[4797]: I1013 15:22:48.421218 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwzl9" podStartSLOduration=2.937774918 podStartE2EDuration="6.421197265s" podCreationTimestamp="2025-10-13 15:22:42 +0000 UTC" firstStartedPulling="2025-10-13 15:22:44.337870257 +0000 UTC m=+8141.871420523" lastFinishedPulling="2025-10-13 15:22:47.821292614 +0000 UTC m=+8145.354842870" observedRunningTime="2025-10-13 15:22:48.41777014 +0000 UTC m=+8145.951320416" watchObservedRunningTime="2025-10-13 15:22:48.421197265 +0000 UTC m=+8145.954747521" Oct 13 15:22:52 crc kubenswrapper[4797]: I1013 15:22:52.729342 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:52 crc kubenswrapper[4797]: I1013 15:22:52.730025 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:52 crc kubenswrapper[4797]: I1013 15:22:52.787787 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:53 crc kubenswrapper[4797]: I1013 15:22:53.525031 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:53 crc kubenswrapper[4797]: I1013 15:22:53.586077 
4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwzl9"] Oct 13 15:22:55 crc kubenswrapper[4797]: I1013 15:22:55.466243 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwzl9" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="registry-server" containerID="cri-o://a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7" gracePeriod=2 Oct 13 15:22:55 crc kubenswrapper[4797]: I1013 15:22:55.950526 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.071042 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-catalog-content\") pod \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.071129 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-utilities\") pod \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.071267 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjtlm\" (UniqueName: \"kubernetes.io/projected/d3337b64-24f8-453c-93c7-8fb7be2ea68f-kube-api-access-cjtlm\") pod \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\" (UID: \"d3337b64-24f8-453c-93c7-8fb7be2ea68f\") " Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.072158 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-utilities" (OuterVolumeSpecName: "utilities") pod 
"d3337b64-24f8-453c-93c7-8fb7be2ea68f" (UID: "d3337b64-24f8-453c-93c7-8fb7be2ea68f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.076500 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3337b64-24f8-453c-93c7-8fb7be2ea68f-kube-api-access-cjtlm" (OuterVolumeSpecName: "kube-api-access-cjtlm") pod "d3337b64-24f8-453c-93c7-8fb7be2ea68f" (UID: "d3337b64-24f8-453c-93c7-8fb7be2ea68f"). InnerVolumeSpecName "kube-api-access-cjtlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.173272 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjtlm\" (UniqueName: \"kubernetes.io/projected/d3337b64-24f8-453c-93c7-8fb7be2ea68f-kube-api-access-cjtlm\") on node \"crc\" DevicePath \"\"" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.173306 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.376975 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3337b64-24f8-453c-93c7-8fb7be2ea68f" (UID: "d3337b64-24f8-453c-93c7-8fb7be2ea68f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.479475 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3337b64-24f8-453c-93c7-8fb7be2ea68f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.483854 4797 generic.go:334] "Generic (PLEG): container finished" podID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerID="a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7" exitCode=0 Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.483906 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerDied","Data":"a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7"} Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.483940 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwzl9" event={"ID":"d3337b64-24f8-453c-93c7-8fb7be2ea68f","Type":"ContainerDied","Data":"c066534eb37a863a3e2efdf46dc5c05bb1acf228f954dc54f439ac8d46de6353"} Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.483963 4797 scope.go:117] "RemoveContainer" containerID="a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.484151 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwzl9" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.526204 4797 scope.go:117] "RemoveContainer" containerID="0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.532526 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwzl9"] Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.542114 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwzl9"] Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.552160 4797 scope.go:117] "RemoveContainer" containerID="aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.591494 4797 scope.go:117] "RemoveContainer" containerID="a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7" Oct 13 15:22:56 crc kubenswrapper[4797]: E1013 15:22:56.592092 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7\": container with ID starting with a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7 not found: ID does not exist" containerID="a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.592129 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7"} err="failed to get container status \"a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7\": rpc error: code = NotFound desc = could not find container \"a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7\": container with ID starting with a71e043b40a0cea80981f3a52b5cac83912a382908004e7411c6cde3881b27d7 not 
found: ID does not exist" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.592155 4797 scope.go:117] "RemoveContainer" containerID="0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506" Oct 13 15:22:56 crc kubenswrapper[4797]: E1013 15:22:56.592447 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506\": container with ID starting with 0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506 not found: ID does not exist" containerID="0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.592470 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506"} err="failed to get container status \"0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506\": rpc error: code = NotFound desc = could not find container \"0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506\": container with ID starting with 0092c5457bc7be40f21a8f3808eaf296cd38d8942a093fafe9f2496c01272506 not found: ID does not exist" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.592482 4797 scope.go:117] "RemoveContainer" containerID="aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b" Oct 13 15:22:56 crc kubenswrapper[4797]: E1013 15:22:56.592753 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b\": container with ID starting with aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b not found: ID does not exist" containerID="aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b" Oct 13 15:22:56 crc kubenswrapper[4797]: I1013 15:22:56.592772 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b"} err="failed to get container status \"aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b\": rpc error: code = NotFound desc = could not find container \"aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b\": container with ID starting with aa5ad21f6ffdb8f920827726fce1fdc6e96308d91ea6bfc52b8fcf7f3dab055b not found: ID does not exist" Oct 13 15:22:57 crc kubenswrapper[4797]: I1013 15:22:57.285465 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" path="/var/lib/kubelet/pods/d3337b64-24f8-453c-93c7-8fb7be2ea68f/volumes" Oct 13 15:23:48 crc kubenswrapper[4797]: I1013 15:23:48.120323 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:23:48 crc kubenswrapper[4797]: I1013 15:23:48.121009 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:24:18 crc kubenswrapper[4797]: I1013 15:24:18.120230 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:24:18 crc kubenswrapper[4797]: I1013 15:24:18.120856 4797 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.126698 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.127579 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.127666 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.129070 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5154c01ef831f582bdc74e590c7f68a0327106b2b148c6d1b0c61b36503e1a2"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.129192 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" 
containerID="cri-o://d5154c01ef831f582bdc74e590c7f68a0327106b2b148c6d1b0c61b36503e1a2" gracePeriod=600 Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.698081 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="d5154c01ef831f582bdc74e590c7f68a0327106b2b148c6d1b0c61b36503e1a2" exitCode=0 Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.698157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"d5154c01ef831f582bdc74e590c7f68a0327106b2b148c6d1b0c61b36503e1a2"} Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.698478 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f"} Oct 13 15:24:48 crc kubenswrapper[4797]: I1013 15:24:48.698505 4797 scope.go:117] "RemoveContainer" containerID="5ae84f5ef96749b196afba3caac331bd51c3761c94a664830d21bf6136f3106b" Oct 13 15:25:14 crc kubenswrapper[4797]: I1013 15:25:14.981151 4797 generic.go:334] "Generic (PLEG): container finished" podID="373fe301-acc1-486b-a109-62e739dec048" containerID="abac1b360265b3acc2d189962ea734b909fb0d92d2b108e188e79328b2929bf7" exitCode=0 Oct 13 15:25:14 crc kubenswrapper[4797]: I1013 15:25:14.981267 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" event={"ID":"373fe301-acc1-486b-a109-62e739dec048","Type":"ContainerDied","Data":"abac1b360265b3acc2d189962ea734b909fb0d92d2b108e188e79328b2929bf7"} Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.424111 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.543781 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-agent-neutron-config-0\") pod \"373fe301-acc1-486b-a109-62e739dec048\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.545059 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt4cq\" (UniqueName: \"kubernetes.io/projected/373fe301-acc1-486b-a109-62e739dec048-kube-api-access-wt4cq\") pod \"373fe301-acc1-486b-a109-62e739dec048\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.545143 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ssh-key\") pod \"373fe301-acc1-486b-a109-62e739dec048\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.545190 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ceph\") pod \"373fe301-acc1-486b-a109-62e739dec048\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.545342 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-combined-ca-bundle\") pod \"373fe301-acc1-486b-a109-62e739dec048\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.545393 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-inventory\") pod \"373fe301-acc1-486b-a109-62e739dec048\" (UID: \"373fe301-acc1-486b-a109-62e739dec048\") " Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.550155 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ceph" (OuterVolumeSpecName: "ceph") pod "373fe301-acc1-486b-a109-62e739dec048" (UID: "373fe301-acc1-486b-a109-62e739dec048"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.551261 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373fe301-acc1-486b-a109-62e739dec048-kube-api-access-wt4cq" (OuterVolumeSpecName: "kube-api-access-wt4cq") pod "373fe301-acc1-486b-a109-62e739dec048" (UID: "373fe301-acc1-486b-a109-62e739dec048"). InnerVolumeSpecName "kube-api-access-wt4cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.551889 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "373fe301-acc1-486b-a109-62e739dec048" (UID: "373fe301-acc1-486b-a109-62e739dec048"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.582692 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "373fe301-acc1-486b-a109-62e739dec048" (UID: "373fe301-acc1-486b-a109-62e739dec048"). 
InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.584544 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-inventory" (OuterVolumeSpecName: "inventory") pod "373fe301-acc1-486b-a109-62e739dec048" (UID: "373fe301-acc1-486b-a109-62e739dec048"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.594779 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "373fe301-acc1-486b-a109-62e739dec048" (UID: "373fe301-acc1-486b-a109-62e739dec048"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.649578 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.649649 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.649664 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.649678 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.649701 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt4cq\" (UniqueName: \"kubernetes.io/projected/373fe301-acc1-486b-a109-62e739dec048-kube-api-access-wt4cq\") on node \"crc\" DevicePath \"\"" Oct 13 15:25:16 crc kubenswrapper[4797]: I1013 15:25:16.649712 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/373fe301-acc1-486b-a109-62e739dec048-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.001440 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" event={"ID":"373fe301-acc1-486b-a109-62e739dec048","Type":"ContainerDied","Data":"26fdfb84518082cdc732bb12957c338ac8a3af46a208a16a4333f0fde8ed8f55"} Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.001500 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26fdfb84518082cdc732bb12957c338ac8a3af46a208a16a4333f0fde8ed8f55" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.001526 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dkdhk" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.115204 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf"] Oct 13 15:25:17 crc kubenswrapper[4797]: E1013 15:25:17.116085 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373fe301-acc1-486b-a109-62e739dec048" containerName="neutron-sriov-openstack-openstack-cell1" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.116117 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="373fe301-acc1-486b-a109-62e739dec048" containerName="neutron-sriov-openstack-openstack-cell1" Oct 13 15:25:17 crc kubenswrapper[4797]: E1013 15:25:17.116156 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="extract-utilities" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.116183 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="extract-utilities" Oct 13 15:25:17 crc kubenswrapper[4797]: E1013 15:25:17.116195 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="registry-server" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.116204 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="registry-server" Oct 13 15:25:17 crc kubenswrapper[4797]: E1013 15:25:17.116219 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="extract-content" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.116228 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="extract-content" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.116462 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d3337b64-24f8-453c-93c7-8fb7be2ea68f" containerName="registry-server" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.116485 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="373fe301-acc1-486b-a109-62e739dec048" containerName="neutron-sriov-openstack-openstack-cell1" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.117922 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.121552 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.121826 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.121835 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.121989 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.122271 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.127246 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf"] Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.262699 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fs8g\" (UniqueName: \"kubernetes.io/projected/17115edf-f950-40b3-9a3b-3948815da323-kube-api-access-7fs8g\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.262918 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.263026 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.263167 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.263208 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.263252 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.365329 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.365390 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.365433 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.365463 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" 
Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.365522 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.365597 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fs8g\" (UniqueName: \"kubernetes.io/projected/17115edf-f950-40b3-9a3b-3948815da323-kube-api-access-7fs8g\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.369655 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.369861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.371500 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-agent-neutron-config-0\") pod 
\"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.371667 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.371822 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.385366 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fs8g\" (UniqueName: \"kubernetes.io/projected/17115edf-f950-40b3-9a3b-3948815da323-kube-api-access-7fs8g\") pod \"neutron-dhcp-openstack-openstack-cell1-sfvnf\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.443577 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:25:17 crc kubenswrapper[4797]: I1013 15:25:17.989504 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf"] Oct 13 15:25:18 crc kubenswrapper[4797]: I1013 15:25:18.013892 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" event={"ID":"17115edf-f950-40b3-9a3b-3948815da323","Type":"ContainerStarted","Data":"2cff8a3b08ecc43b2fb65b0620d4e5110346323036bddcbdef87171d5e5dc3da"} Oct 13 15:25:19 crc kubenswrapper[4797]: I1013 15:25:19.031081 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" event={"ID":"17115edf-f950-40b3-9a3b-3948815da323","Type":"ContainerStarted","Data":"c2df3e842601f97f4dfa5a6f91173a8f7a01d31d2da6eaeba9fcdd112cf5bc4f"} Oct 13 15:25:19 crc kubenswrapper[4797]: I1013 15:25:19.054569 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" podStartSLOduration=1.584819295 podStartE2EDuration="2.054547169s" podCreationTimestamp="2025-10-13 15:25:17 +0000 UTC" firstStartedPulling="2025-10-13 15:25:17.992514632 +0000 UTC m=+8295.526064898" lastFinishedPulling="2025-10-13 15:25:18.462242506 +0000 UTC m=+8295.995792772" observedRunningTime="2025-10-13 15:25:19.043964419 +0000 UTC m=+8296.577514685" watchObservedRunningTime="2025-10-13 15:25:19.054547169 +0000 UTC m=+8296.588097435" Oct 13 15:26:48 crc kubenswrapper[4797]: I1013 15:26:48.119696 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:26:48 crc kubenswrapper[4797]: I1013 15:26:48.120273 4797 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:27:18 crc kubenswrapper[4797]: I1013 15:27:18.120028 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:27:18 crc kubenswrapper[4797]: I1013 15:27:18.121011 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.120499 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.121158 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.121223 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.121995 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.122057 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" gracePeriod=600 Oct 13 15:27:48 crc kubenswrapper[4797]: E1013 15:27:48.242238 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.626392 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" exitCode=0 Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.626438 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f"} Oct 13 15:27:48 crc 
kubenswrapper[4797]: I1013 15:27:48.626478 4797 scope.go:117] "RemoveContainer" containerID="d5154c01ef831f582bdc74e590c7f68a0327106b2b148c6d1b0c61b36503e1a2" Oct 13 15:27:48 crc kubenswrapper[4797]: I1013 15:27:48.627697 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:27:48 crc kubenswrapper[4797]: E1013 15:27:48.628499 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:27:59 crc kubenswrapper[4797]: I1013 15:27:59.236057 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:27:59 crc kubenswrapper[4797]: E1013 15:27:59.237080 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:28:10 crc kubenswrapper[4797]: I1013 15:28:10.236433 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:28:10 crc kubenswrapper[4797]: E1013 15:28:10.237272 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.183414 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smswv"] Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.188437 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.197736 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smswv"] Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.258318 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-utilities\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.258413 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8fk\" (UniqueName: \"kubernetes.io/projected/5fec5443-bfc7-48e6-896b-aee58dc6c74b-kube-api-access-gh8fk\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.258510 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-catalog-content\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " 
pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.361680 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8fk\" (UniqueName: \"kubernetes.io/projected/5fec5443-bfc7-48e6-896b-aee58dc6c74b-kube-api-access-gh8fk\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.362325 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-catalog-content\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.362383 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-catalog-content\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.362627 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-utilities\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.362990 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-utilities\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " 
pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.379759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8fk\" (UniqueName: \"kubernetes.io/projected/5fec5443-bfc7-48e6-896b-aee58dc6c74b-kube-api-access-gh8fk\") pod \"community-operators-smswv\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:20 crc kubenswrapper[4797]: I1013 15:28:20.516818 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:21 crc kubenswrapper[4797]: I1013 15:28:21.049688 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smswv"] Oct 13 15:28:21 crc kubenswrapper[4797]: I1013 15:28:21.236056 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:28:21 crc kubenswrapper[4797]: E1013 15:28:21.236586 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:28:21 crc kubenswrapper[4797]: I1013 15:28:21.975513 4797 generic.go:334] "Generic (PLEG): container finished" podID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerID="8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9" exitCode=0 Oct 13 15:28:21 crc kubenswrapper[4797]: I1013 15:28:21.975583 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" 
event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerDied","Data":"8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9"} Oct 13 15:28:21 crc kubenswrapper[4797]: I1013 15:28:21.975653 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerStarted","Data":"ce00e8458ba65a51d17bdeee066b662a9cd2bf50c38d6f1b7eb8a313602f42f1"} Oct 13 15:28:21 crc kubenswrapper[4797]: I1013 15:28:21.977560 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:28:22 crc kubenswrapper[4797]: I1013 15:28:22.988932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerStarted","Data":"98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7"} Oct 13 15:28:25 crc kubenswrapper[4797]: I1013 15:28:25.015228 4797 generic.go:334] "Generic (PLEG): container finished" podID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerID="98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7" exitCode=0 Oct 13 15:28:25 crc kubenswrapper[4797]: I1013 15:28:25.015384 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerDied","Data":"98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7"} Oct 13 15:28:26 crc kubenswrapper[4797]: I1013 15:28:26.030543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerStarted","Data":"c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e"} Oct 13 15:28:26 crc kubenswrapper[4797]: I1013 15:28:26.061477 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-smswv" podStartSLOduration=2.563520554 podStartE2EDuration="6.061451358s" podCreationTimestamp="2025-10-13 15:28:20 +0000 UTC" firstStartedPulling="2025-10-13 15:28:21.977346672 +0000 UTC m=+8479.510896928" lastFinishedPulling="2025-10-13 15:28:25.475277476 +0000 UTC m=+8483.008827732" observedRunningTime="2025-10-13 15:28:26.050002566 +0000 UTC m=+8483.583552842" watchObservedRunningTime="2025-10-13 15:28:26.061451358 +0000 UTC m=+8483.595001654" Oct 13 15:28:30 crc kubenswrapper[4797]: I1013 15:28:30.517469 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:30 crc kubenswrapper[4797]: I1013 15:28:30.518035 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:31 crc kubenswrapper[4797]: I1013 15:28:31.569221 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-smswv" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="registry-server" probeResult="failure" output=< Oct 13 15:28:31 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:28:31 crc kubenswrapper[4797]: > Oct 13 15:28:32 crc kubenswrapper[4797]: I1013 15:28:32.236146 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:28:32 crc kubenswrapper[4797]: E1013 15:28:32.236601 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:28:40 crc 
kubenswrapper[4797]: I1013 15:28:40.576568 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:40 crc kubenswrapper[4797]: I1013 15:28:40.635593 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:40 crc kubenswrapper[4797]: I1013 15:28:40.827395 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smswv"] Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.226047 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smswv" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="registry-server" containerID="cri-o://c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e" gracePeriod=2 Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.778667 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.898548 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh8fk\" (UniqueName: \"kubernetes.io/projected/5fec5443-bfc7-48e6-896b-aee58dc6c74b-kube-api-access-gh8fk\") pod \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.898658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-catalog-content\") pod \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.898907 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-utilities\") pod \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\" (UID: \"5fec5443-bfc7-48e6-896b-aee58dc6c74b\") " Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.900355 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-utilities" (OuterVolumeSpecName: "utilities") pod "5fec5443-bfc7-48e6-896b-aee58dc6c74b" (UID: "5fec5443-bfc7-48e6-896b-aee58dc6c74b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.911037 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fec5443-bfc7-48e6-896b-aee58dc6c74b-kube-api-access-gh8fk" (OuterVolumeSpecName: "kube-api-access-gh8fk") pod "5fec5443-bfc7-48e6-896b-aee58dc6c74b" (UID: "5fec5443-bfc7-48e6-896b-aee58dc6c74b"). InnerVolumeSpecName "kube-api-access-gh8fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:28:42 crc kubenswrapper[4797]: I1013 15:28:42.963419 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fec5443-bfc7-48e6-896b-aee58dc6c74b" (UID: "5fec5443-bfc7-48e6-896b-aee58dc6c74b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.002029 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh8fk\" (UniqueName: \"kubernetes.io/projected/5fec5443-bfc7-48e6-896b-aee58dc6c74b-kube-api-access-gh8fk\") on node \"crc\" DevicePath \"\"" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.002084 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.002096 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec5443-bfc7-48e6-896b-aee58dc6c74b-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.239374 4797 generic.go:334] "Generic (PLEG): container finished" podID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerID="c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e" exitCode=0 Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.244865 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smswv" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.245446 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:28:43 crc kubenswrapper[4797]: E1013 15:28:43.245944 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.258963 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerDied","Data":"c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e"} Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.259254 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smswv" event={"ID":"5fec5443-bfc7-48e6-896b-aee58dc6c74b","Type":"ContainerDied","Data":"ce00e8458ba65a51d17bdeee066b662a9cd2bf50c38d6f1b7eb8a313602f42f1"} Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.259385 4797 scope.go:117] "RemoveContainer" containerID="c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.301408 4797 scope.go:117] "RemoveContainer" containerID="98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.307393 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smswv"] Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.321027 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smswv"] Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.341261 4797 scope.go:117] "RemoveContainer" containerID="8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.402544 4797 scope.go:117] "RemoveContainer" containerID="c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e" Oct 13 15:28:43 crc kubenswrapper[4797]: E1013 15:28:43.403351 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e\": container with ID starting with c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e not found: ID does not exist" containerID="c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.403393 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e"} err="failed to get container status \"c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e\": rpc error: code = NotFound desc = could not find container \"c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e\": container with ID starting with c7d478b6be272762a29c43dbee086d9f2cc2b2285278f35c8190bcc5e0eec93e not found: ID does not exist" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.403430 4797 scope.go:117] "RemoveContainer" containerID="98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7" Oct 13 15:28:43 crc kubenswrapper[4797]: E1013 15:28:43.403851 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7\": container with ID starting with 
98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7 not found: ID does not exist" containerID="98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.403887 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7"} err="failed to get container status \"98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7\": rpc error: code = NotFound desc = could not find container \"98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7\": container with ID starting with 98298caa525ac301aab17d6a152691bbaba718f3623f0411e3f7bd71659ca1e7 not found: ID does not exist" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.404021 4797 scope.go:117] "RemoveContainer" containerID="8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9" Oct 13 15:28:43 crc kubenswrapper[4797]: E1013 15:28:43.404430 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9\": container with ID starting with 8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9 not found: ID does not exist" containerID="8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9" Oct 13 15:28:43 crc kubenswrapper[4797]: I1013 15:28:43.404463 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9"} err="failed to get container status \"8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9\": rpc error: code = NotFound desc = could not find container \"8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9\": container with ID starting with 8c23adf5768e9ced854213c106b2fd6c5468fe9fab399bca9cc0a772626cccf9 not found: ID does not 
exist" Oct 13 15:28:45 crc kubenswrapper[4797]: I1013 15:28:45.252640 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" path="/var/lib/kubelet/pods/5fec5443-bfc7-48e6-896b-aee58dc6c74b/volumes" Oct 13 15:28:57 crc kubenswrapper[4797]: I1013 15:28:57.237731 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:28:57 crc kubenswrapper[4797]: E1013 15:28:57.239027 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:29:11 crc kubenswrapper[4797]: I1013 15:29:11.236929 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:29:11 crc kubenswrapper[4797]: E1013 15:29:11.238971 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:29:23 crc kubenswrapper[4797]: I1013 15:29:23.246966 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:29:23 crc kubenswrapper[4797]: E1013 15:29:23.247912 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:29:38 crc kubenswrapper[4797]: I1013 15:29:38.236088 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:29:38 crc kubenswrapper[4797]: E1013 15:29:38.236792 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:29:51 crc kubenswrapper[4797]: I1013 15:29:51.237746 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:29:51 crc kubenswrapper[4797]: E1013 15:29:51.238548 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.165695 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg"] Oct 13 15:30:00 crc kubenswrapper[4797]: E1013 15:30:00.166716 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" 
containerName="extract-content" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.166733 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="extract-content" Oct 13 15:30:00 crc kubenswrapper[4797]: E1013 15:30:00.166770 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="registry-server" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.166778 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="registry-server" Oct 13 15:30:00 crc kubenswrapper[4797]: E1013 15:30:00.166797 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="extract-utilities" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.166824 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="extract-utilities" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.167021 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fec5443-bfc7-48e6-896b-aee58dc6c74b" containerName="registry-server" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.167797 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.170776 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.170777 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.187134 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg"] Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.279412 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-secret-volume\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.279501 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-config-volume\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.279639 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99qvf\" (UniqueName: \"kubernetes.io/projected/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-kube-api-access-99qvf\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.381834 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99qvf\" (UniqueName: \"kubernetes.io/projected/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-kube-api-access-99qvf\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.381940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-secret-volume\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.382075 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-config-volume\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.383149 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-config-volume\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.390655 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-secret-volume\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.412487 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99qvf\" (UniqueName: \"kubernetes.io/projected/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-kube-api-access-99qvf\") pod \"collect-profiles-29339490-v97hg\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.493870 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:00 crc kubenswrapper[4797]: I1013 15:30:00.956144 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg"] Oct 13 15:30:01 crc kubenswrapper[4797]: I1013 15:30:01.128510 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" event={"ID":"82d55df3-5fdc-4b11-9ad7-b7ac219683a4","Type":"ContainerStarted","Data":"6e54176640b8dfc8bc20ca18a185740c10c9170e3664707b001603ddbdeb8ffa"} Oct 13 15:30:02 crc kubenswrapper[4797]: I1013 15:30:02.138898 4797 generic.go:334] "Generic (PLEG): container finished" podID="82d55df3-5fdc-4b11-9ad7-b7ac219683a4" containerID="86c994b2026777c3a58adc328d6ee80183169468f3cd229a5867b22a506bf8fe" exitCode=0 Oct 13 15:30:02 crc kubenswrapper[4797]: I1013 15:30:02.138992 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" 
event={"ID":"82d55df3-5fdc-4b11-9ad7-b7ac219683a4","Type":"ContainerDied","Data":"86c994b2026777c3a58adc328d6ee80183169468f3cd229a5867b22a506bf8fe"} Oct 13 15:30:02 crc kubenswrapper[4797]: I1013 15:30:02.236847 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:30:02 crc kubenswrapper[4797]: E1013 15:30:02.237144 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.525134 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.681423 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-secret-volume\") pod \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.681662 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-config-volume\") pod \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.681697 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99qvf\" (UniqueName: 
\"kubernetes.io/projected/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-kube-api-access-99qvf\") pod \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\" (UID: \"82d55df3-5fdc-4b11-9ad7-b7ac219683a4\") " Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.682303 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "82d55df3-5fdc-4b11-9ad7-b7ac219683a4" (UID: "82d55df3-5fdc-4b11-9ad7-b7ac219683a4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.687967 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82d55df3-5fdc-4b11-9ad7-b7ac219683a4" (UID: "82d55df3-5fdc-4b11-9ad7-b7ac219683a4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.688475 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-kube-api-access-99qvf" (OuterVolumeSpecName: "kube-api-access-99qvf") pod "82d55df3-5fdc-4b11-9ad7-b7ac219683a4" (UID: "82d55df3-5fdc-4b11-9ad7-b7ac219683a4"). InnerVolumeSpecName "kube-api-access-99qvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.783667 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.783709 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:30:03 crc kubenswrapper[4797]: I1013 15:30:03.783719 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99qvf\" (UniqueName: \"kubernetes.io/projected/82d55df3-5fdc-4b11-9ad7-b7ac219683a4-kube-api-access-99qvf\") on node \"crc\" DevicePath \"\"" Oct 13 15:30:04 crc kubenswrapper[4797]: I1013 15:30:04.157435 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" event={"ID":"82d55df3-5fdc-4b11-9ad7-b7ac219683a4","Type":"ContainerDied","Data":"6e54176640b8dfc8bc20ca18a185740c10c9170e3664707b001603ddbdeb8ffa"} Oct 13 15:30:04 crc kubenswrapper[4797]: I1013 15:30:04.157485 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e54176640b8dfc8bc20ca18a185740c10c9170e3664707b001603ddbdeb8ffa" Oct 13 15:30:04 crc kubenswrapper[4797]: I1013 15:30:04.157484 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339490-v97hg" Oct 13 15:30:04 crc kubenswrapper[4797]: I1013 15:30:04.599931 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh"] Oct 13 15:30:04 crc kubenswrapper[4797]: I1013 15:30:04.608481 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339445-2wvzh"] Oct 13 15:30:05 crc kubenswrapper[4797]: I1013 15:30:05.251899 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca" path="/var/lib/kubelet/pods/d7d23b2c-c8c8-4d5e-8577-0574a1dbf3ca/volumes" Oct 13 15:30:17 crc kubenswrapper[4797]: I1013 15:30:17.236504 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:30:17 crc kubenswrapper[4797]: E1013 15:30:17.238465 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:30:20 crc kubenswrapper[4797]: I1013 15:30:20.180624 4797 scope.go:117] "RemoveContainer" containerID="3ace9afe25f80a81e9ab314e3b95faca7ccbfc352e6c7a8cd8ebed938bc7004c" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.709021 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fpxmk"] Oct 13 15:30:30 crc kubenswrapper[4797]: E1013 15:30:30.709955 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d55df3-5fdc-4b11-9ad7-b7ac219683a4" containerName="collect-profiles" Oct 13 15:30:30 crc 
kubenswrapper[4797]: I1013 15:30:30.709969 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d55df3-5fdc-4b11-9ad7-b7ac219683a4" containerName="collect-profiles" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.710207 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d55df3-5fdc-4b11-9ad7-b7ac219683a4" containerName="collect-profiles" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.711734 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.719503 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpxmk"] Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.814143 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6lqz\" (UniqueName: \"kubernetes.io/projected/b02861e3-2075-44f0-87da-4a4c772f2532-kube-api-access-m6lqz\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.814335 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-utilities\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.814691 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-catalog-content\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 
crc kubenswrapper[4797]: I1013 15:30:30.916507 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-catalog-content\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.916588 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6lqz\" (UniqueName: \"kubernetes.io/projected/b02861e3-2075-44f0-87da-4a4c772f2532-kube-api-access-m6lqz\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.916638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-utilities\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.917096 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-catalog-content\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.917120 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-utilities\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:30 crc kubenswrapper[4797]: I1013 15:30:30.947474 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6lqz\" (UniqueName: \"kubernetes.io/projected/b02861e3-2075-44f0-87da-4a4c772f2532-kube-api-access-m6lqz\") pod \"redhat-marketplace-fpxmk\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:31 crc kubenswrapper[4797]: I1013 15:30:31.062386 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:31 crc kubenswrapper[4797]: I1013 15:30:31.509317 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpxmk"] Oct 13 15:30:32 crc kubenswrapper[4797]: I1013 15:30:32.237312 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:30:32 crc kubenswrapper[4797]: E1013 15:30:32.238235 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:30:32 crc kubenswrapper[4797]: I1013 15:30:32.471272 4797 generic.go:334] "Generic (PLEG): container finished" podID="b02861e3-2075-44f0-87da-4a4c772f2532" containerID="aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230" exitCode=0 Oct 13 15:30:32 crc kubenswrapper[4797]: I1013 15:30:32.471314 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpxmk" event={"ID":"b02861e3-2075-44f0-87da-4a4c772f2532","Type":"ContainerDied","Data":"aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230"} Oct 13 15:30:32 crc kubenswrapper[4797]: I1013 
15:30:32.471337 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpxmk" event={"ID":"b02861e3-2075-44f0-87da-4a4c772f2532","Type":"ContainerStarted","Data":"f9055b2d4b519a9abdbaa3dd30ad035b60ece559f9b61835551c0eb13fe84954"} Oct 13 15:30:34 crc kubenswrapper[4797]: I1013 15:30:34.494928 4797 generic.go:334] "Generic (PLEG): container finished" podID="b02861e3-2075-44f0-87da-4a4c772f2532" containerID="5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a" exitCode=0 Oct 13 15:30:34 crc kubenswrapper[4797]: I1013 15:30:34.495022 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpxmk" event={"ID":"b02861e3-2075-44f0-87da-4a4c772f2532","Type":"ContainerDied","Data":"5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a"} Oct 13 15:30:35 crc kubenswrapper[4797]: I1013 15:30:35.509732 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpxmk" event={"ID":"b02861e3-2075-44f0-87da-4a4c772f2532","Type":"ContainerStarted","Data":"c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963"} Oct 13 15:30:35 crc kubenswrapper[4797]: I1013 15:30:35.547268 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fpxmk" podStartSLOduration=2.8517304169999997 podStartE2EDuration="5.54721603s" podCreationTimestamp="2025-10-13 15:30:30 +0000 UTC" firstStartedPulling="2025-10-13 15:30:32.472956585 +0000 UTC m=+8610.006506841" lastFinishedPulling="2025-10-13 15:30:35.168442188 +0000 UTC m=+8612.701992454" observedRunningTime="2025-10-13 15:30:35.532141879 +0000 UTC m=+8613.065692205" watchObservedRunningTime="2025-10-13 15:30:35.54721603 +0000 UTC m=+8613.080766326" Oct 13 15:30:41 crc kubenswrapper[4797]: I1013 15:30:41.063360 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:41 crc kubenswrapper[4797]: I1013 15:30:41.064008 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:41 crc kubenswrapper[4797]: I1013 15:30:41.141778 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:41 crc kubenswrapper[4797]: I1013 15:30:41.629536 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:41 crc kubenswrapper[4797]: I1013 15:30:41.674617 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpxmk"] Oct 13 15:30:43 crc kubenswrapper[4797]: I1013 15:30:43.242970 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:30:43 crc kubenswrapper[4797]: E1013 15:30:43.243298 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:30:43 crc kubenswrapper[4797]: I1013 15:30:43.600395 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fpxmk" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="registry-server" containerID="cri-o://c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963" gracePeriod=2 Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.097572 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.205653 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-catalog-content\") pod \"b02861e3-2075-44f0-87da-4a4c772f2532\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.205895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6lqz\" (UniqueName: \"kubernetes.io/projected/b02861e3-2075-44f0-87da-4a4c772f2532-kube-api-access-m6lqz\") pod \"b02861e3-2075-44f0-87da-4a4c772f2532\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.205942 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-utilities\") pod \"b02861e3-2075-44f0-87da-4a4c772f2532\" (UID: \"b02861e3-2075-44f0-87da-4a4c772f2532\") " Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.206618 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-utilities" (OuterVolumeSpecName: "utilities") pod "b02861e3-2075-44f0-87da-4a4c772f2532" (UID: "b02861e3-2075-44f0-87da-4a4c772f2532"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.211330 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b02861e3-2075-44f0-87da-4a4c772f2532-kube-api-access-m6lqz" (OuterVolumeSpecName: "kube-api-access-m6lqz") pod "b02861e3-2075-44f0-87da-4a4c772f2532" (UID: "b02861e3-2075-44f0-87da-4a4c772f2532"). InnerVolumeSpecName "kube-api-access-m6lqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.218905 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b02861e3-2075-44f0-87da-4a4c772f2532" (UID: "b02861e3-2075-44f0-87da-4a4c772f2532"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.308197 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6lqz\" (UniqueName: \"kubernetes.io/projected/b02861e3-2075-44f0-87da-4a4c772f2532-kube-api-access-m6lqz\") on node \"crc\" DevicePath \"\"" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.308233 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.308361 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b02861e3-2075-44f0-87da-4a4c772f2532-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.612070 4797 generic.go:334] "Generic (PLEG): container finished" podID="b02861e3-2075-44f0-87da-4a4c772f2532" containerID="c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963" exitCode=0 Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.612117 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpxmk" event={"ID":"b02861e3-2075-44f0-87da-4a4c772f2532","Type":"ContainerDied","Data":"c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963"} Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.612174 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpxmk" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.612194 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpxmk" event={"ID":"b02861e3-2075-44f0-87da-4a4c772f2532","Type":"ContainerDied","Data":"f9055b2d4b519a9abdbaa3dd30ad035b60ece559f9b61835551c0eb13fe84954"} Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.612220 4797 scope.go:117] "RemoveContainer" containerID="c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.651499 4797 scope.go:117] "RemoveContainer" containerID="5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.660340 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpxmk"] Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.670907 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpxmk"] Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.676394 4797 scope.go:117] "RemoveContainer" containerID="aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.732047 4797 scope.go:117] "RemoveContainer" containerID="c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963" Oct 13 15:30:44 crc kubenswrapper[4797]: E1013 15:30:44.732744 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963\": container with ID starting with c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963 not found: ID does not exist" containerID="c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.732795 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963"} err="failed to get container status \"c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963\": rpc error: code = NotFound desc = could not find container \"c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963\": container with ID starting with c55c8e4d326a891ee02a75822ffdd0f5238e06b4afbf988b65d3820717ff9963 not found: ID does not exist" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.732902 4797 scope.go:117] "RemoveContainer" containerID="5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a" Oct 13 15:30:44 crc kubenswrapper[4797]: E1013 15:30:44.733340 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a\": container with ID starting with 5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a not found: ID does not exist" containerID="5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.733486 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a"} err="failed to get container status \"5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a\": rpc error: code = NotFound desc = could not find container \"5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a\": container with ID starting with 5f5428a63a6f00db3e8a5718106ec7ea51282fc85dfd6b99b92c029a4f4d4b2a not found: ID does not exist" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.733591 4797 scope.go:117] "RemoveContainer" containerID="aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230" Oct 13 15:30:44 crc kubenswrapper[4797]: E1013 
15:30:44.734068 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230\": container with ID starting with aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230 not found: ID does not exist" containerID="aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230" Oct 13 15:30:44 crc kubenswrapper[4797]: I1013 15:30:44.734113 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230"} err="failed to get container status \"aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230\": rpc error: code = NotFound desc = could not find container \"aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230\": container with ID starting with aaf1da0b17b7a055efbb26fde8811909b3aa128d97b4fcf8ec4d4a16655f2230 not found: ID does not exist" Oct 13 15:30:45 crc kubenswrapper[4797]: I1013 15:30:45.249867 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" path="/var/lib/kubelet/pods/b02861e3-2075-44f0-87da-4a4c772f2532/volumes" Oct 13 15:30:54 crc kubenswrapper[4797]: I1013 15:30:54.236214 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:30:54 crc kubenswrapper[4797]: E1013 15:30:54.237434 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:31:05 crc kubenswrapper[4797]: I1013 15:31:05.237158 
4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:31:05 crc kubenswrapper[4797]: E1013 15:31:05.238262 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:31:18 crc kubenswrapper[4797]: I1013 15:31:18.236748 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:31:18 crc kubenswrapper[4797]: E1013 15:31:18.238022 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.168734 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4m8z9"] Oct 13 15:31:22 crc kubenswrapper[4797]: E1013 15:31:22.169888 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="extract-utilities" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.169903 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="extract-utilities" Oct 13 15:31:22 crc kubenswrapper[4797]: E1013 15:31:22.169930 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="registry-server" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.169938 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="registry-server" Oct 13 15:31:22 crc kubenswrapper[4797]: E1013 15:31:22.169953 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="extract-content" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.169962 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="extract-content" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.170221 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b02861e3-2075-44f0-87da-4a4c772f2532" containerName="registry-server" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.172273 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.179450 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4m8z9"] Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.297417 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vr5l\" (UniqueName: \"kubernetes.io/projected/50f028f5-b1b0-4136-9f4d-0addc9de54a8-kube-api-access-8vr5l\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.297746 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-utilities\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " 
pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.297815 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-catalog-content\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.401073 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vr5l\" (UniqueName: \"kubernetes.io/projected/50f028f5-b1b0-4136-9f4d-0addc9de54a8-kube-api-access-8vr5l\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.401137 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-utilities\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.401192 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-catalog-content\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.401679 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-utilities\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " 
pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.401725 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-catalog-content\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.423841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vr5l\" (UniqueName: \"kubernetes.io/projected/50f028f5-b1b0-4136-9f4d-0addc9de54a8-kube-api-access-8vr5l\") pod \"redhat-operators-4m8z9\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:22 crc kubenswrapper[4797]: I1013 15:31:22.498686 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:23 crc kubenswrapper[4797]: I1013 15:31:23.230140 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4m8z9"] Oct 13 15:31:23 crc kubenswrapper[4797]: I1013 15:31:23.987964 4797 generic.go:334] "Generic (PLEG): container finished" podID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerID="93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292" exitCode=0 Oct 13 15:31:23 crc kubenswrapper[4797]: I1013 15:31:23.988073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4m8z9" event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerDied","Data":"93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292"} Oct 13 15:31:23 crc kubenswrapper[4797]: I1013 15:31:23.988316 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4m8z9" 
event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerStarted","Data":"faf170a034247ec62554ec627b2d59d7c05553f83b87660f4ccc868838624889"} Oct 13 15:31:26 crc kubenswrapper[4797]: I1013 15:31:26.013709 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4m8z9" event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerStarted","Data":"8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449"} Oct 13 15:31:29 crc kubenswrapper[4797]: I1013 15:31:29.236846 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:31:29 crc kubenswrapper[4797]: E1013 15:31:29.237607 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:31:30 crc kubenswrapper[4797]: I1013 15:31:30.064588 4797 generic.go:334] "Generic (PLEG): container finished" podID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerID="8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449" exitCode=0 Oct 13 15:31:30 crc kubenswrapper[4797]: I1013 15:31:30.064721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4m8z9" event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerDied","Data":"8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449"} Oct 13 15:31:31 crc kubenswrapper[4797]: I1013 15:31:31.076363 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4m8z9" 
event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerStarted","Data":"1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41"} Oct 13 15:31:31 crc kubenswrapper[4797]: I1013 15:31:31.106918 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4m8z9" podStartSLOduration=2.567673252 podStartE2EDuration="9.106897549s" podCreationTimestamp="2025-10-13 15:31:22 +0000 UTC" firstStartedPulling="2025-10-13 15:31:23.990829159 +0000 UTC m=+8661.524379435" lastFinishedPulling="2025-10-13 15:31:30.530053476 +0000 UTC m=+8668.063603732" observedRunningTime="2025-10-13 15:31:31.099523707 +0000 UTC m=+8668.633074033" watchObservedRunningTime="2025-10-13 15:31:31.106897549 +0000 UTC m=+8668.640447805" Oct 13 15:31:32 crc kubenswrapper[4797]: I1013 15:31:32.499903 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:32 crc kubenswrapper[4797]: I1013 15:31:32.499991 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:33 crc kubenswrapper[4797]: I1013 15:31:33.547182 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4m8z9" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="registry-server" probeResult="failure" output=< Oct 13 15:31:33 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:31:33 crc kubenswrapper[4797]: > Oct 13 15:31:42 crc kubenswrapper[4797]: I1013 15:31:42.237243 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:31:42 crc kubenswrapper[4797]: E1013 15:31:42.238533 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:31:42 crc kubenswrapper[4797]: I1013 15:31:42.551284 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:42 crc kubenswrapper[4797]: I1013 15:31:42.604919 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:42 crc kubenswrapper[4797]: I1013 15:31:42.793841 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4m8z9"] Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.196695 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4m8z9" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="registry-server" containerID="cri-o://1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41" gracePeriod=2 Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.807669 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.896386 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-utilities\") pod \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.896466 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-catalog-content\") pod \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.896503 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vr5l\" (UniqueName: \"kubernetes.io/projected/50f028f5-b1b0-4136-9f4d-0addc9de54a8-kube-api-access-8vr5l\") pod \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\" (UID: \"50f028f5-b1b0-4136-9f4d-0addc9de54a8\") " Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.898025 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-utilities" (OuterVolumeSpecName: "utilities") pod "50f028f5-b1b0-4136-9f4d-0addc9de54a8" (UID: "50f028f5-b1b0-4136-9f4d-0addc9de54a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.904331 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f028f5-b1b0-4136-9f4d-0addc9de54a8-kube-api-access-8vr5l" (OuterVolumeSpecName: "kube-api-access-8vr5l") pod "50f028f5-b1b0-4136-9f4d-0addc9de54a8" (UID: "50f028f5-b1b0-4136-9f4d-0addc9de54a8"). InnerVolumeSpecName "kube-api-access-8vr5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.982125 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50f028f5-b1b0-4136-9f4d-0addc9de54a8" (UID: "50f028f5-b1b0-4136-9f4d-0addc9de54a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.998986 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.999025 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f028f5-b1b0-4136-9f4d-0addc9de54a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:31:44 crc kubenswrapper[4797]: I1013 15:31:44.999037 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vr5l\" (UniqueName: \"kubernetes.io/projected/50f028f5-b1b0-4136-9f4d-0addc9de54a8-kube-api-access-8vr5l\") on node \"crc\" DevicePath \"\"" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.213395 4797 generic.go:334] "Generic (PLEG): container finished" podID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerID="1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41" exitCode=0 Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.213436 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4m8z9" event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerDied","Data":"1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41"} Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.213462 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-4m8z9" event={"ID":"50f028f5-b1b0-4136-9f4d-0addc9de54a8","Type":"ContainerDied","Data":"faf170a034247ec62554ec627b2d59d7c05553f83b87660f4ccc868838624889"} Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.213478 4797 scope.go:117] "RemoveContainer" containerID="1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.213607 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4m8z9" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.257586 4797 scope.go:117] "RemoveContainer" containerID="8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.257775 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4m8z9"] Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.275011 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4m8z9"] Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.284064 4797 scope.go:117] "RemoveContainer" containerID="93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.325008 4797 scope.go:117] "RemoveContainer" containerID="1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41" Oct 13 15:31:45 crc kubenswrapper[4797]: E1013 15:31:45.326075 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41\": container with ID starting with 1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41 not found: ID does not exist" containerID="1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.326125 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41"} err="failed to get container status \"1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41\": rpc error: code = NotFound desc = could not find container \"1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41\": container with ID starting with 1c9a422e9c0f8d65d97c95d309208faa4dd53d97b96420a6dd287f2f95aa6d41 not found: ID does not exist" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.326152 4797 scope.go:117] "RemoveContainer" containerID="8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449" Oct 13 15:31:45 crc kubenswrapper[4797]: E1013 15:31:45.326419 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449\": container with ID starting with 8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449 not found: ID does not exist" containerID="8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.326443 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449"} err="failed to get container status \"8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449\": rpc error: code = NotFound desc = could not find container \"8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449\": container with ID starting with 8cb465be1d280df13adae5eb65f92e22ec27dec35c76b2b02830ce9a30db6449 not found: ID does not exist" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.326457 4797 scope.go:117] "RemoveContainer" containerID="93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292" Oct 13 15:31:45 crc kubenswrapper[4797]: E1013 
15:31:45.326623 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292\": container with ID starting with 93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292 not found: ID does not exist" containerID="93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292" Oct 13 15:31:45 crc kubenswrapper[4797]: I1013 15:31:45.326641 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292"} err="failed to get container status \"93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292\": rpc error: code = NotFound desc = could not find container \"93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292\": container with ID starting with 93475e4485dff81be221a4d694d21a8ee883ec194ea6df2e1e7e08b99c035292 not found: ID does not exist" Oct 13 15:31:47 crc kubenswrapper[4797]: I1013 15:31:47.247684 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" path="/var/lib/kubelet/pods/50f028f5-b1b0-4136-9f4d-0addc9de54a8/volumes" Oct 13 15:31:53 crc kubenswrapper[4797]: I1013 15:31:53.243047 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:31:53 crc kubenswrapper[4797]: E1013 15:31:53.243905 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:32:06 crc kubenswrapper[4797]: I1013 15:32:06.236526 
4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:32:06 crc kubenswrapper[4797]: E1013 15:32:06.237474 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:32:20 crc kubenswrapper[4797]: I1013 15:32:20.237454 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:32:20 crc kubenswrapper[4797]: E1013 15:32:20.240067 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:32:31 crc kubenswrapper[4797]: I1013 15:32:31.236162 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:32:31 crc kubenswrapper[4797]: E1013 15:32:31.236969 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 
15:32:43.315997 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-95sw4"] Oct 13 15:32:43 crc kubenswrapper[4797]: E1013 15:32:43.317304 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="extract-utilities" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.317321 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="extract-utilities" Oct 13 15:32:43 crc kubenswrapper[4797]: E1013 15:32:43.317364 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="registry-server" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.317373 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="registry-server" Oct 13 15:32:43 crc kubenswrapper[4797]: E1013 15:32:43.317397 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="extract-content" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.317405 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="extract-content" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.317672 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f028f5-b1b0-4136-9f4d-0addc9de54a8" containerName="registry-server" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.321114 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.354012 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95sw4"] Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.475768 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-utilities\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.476150 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tngdc\" (UniqueName: \"kubernetes.io/projected/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-kube-api-access-tngdc\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.476268 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-catalog-content\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.577913 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-utilities\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.578009 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tngdc\" (UniqueName: \"kubernetes.io/projected/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-kube-api-access-tngdc\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.578100 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-catalog-content\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.578420 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-utilities\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.578482 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-catalog-content\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.603373 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tngdc\" (UniqueName: \"kubernetes.io/projected/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-kube-api-access-tngdc\") pod \"certified-operators-95sw4\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:43 crc kubenswrapper[4797]: I1013 15:32:43.655516 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.212961 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95sw4"] Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.235879 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:32:44 crc kubenswrapper[4797]: E1013 15:32:44.236140 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.858983 4797 generic.go:334] "Generic (PLEG): container finished" podID="17115edf-f950-40b3-9a3b-3948815da323" containerID="c2df3e842601f97f4dfa5a6f91173a8f7a01d31d2da6eaeba9fcdd112cf5bc4f" exitCode=0 Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.859107 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" event={"ID":"17115edf-f950-40b3-9a3b-3948815da323","Type":"ContainerDied","Data":"c2df3e842601f97f4dfa5a6f91173a8f7a01d31d2da6eaeba9fcdd112cf5bc4f"} Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.861108 4797 generic.go:334] "Generic (PLEG): container finished" podID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerID="8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1" exitCode=0 Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.861136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95sw4" 
event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerDied","Data":"8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1"} Oct 13 15:32:44 crc kubenswrapper[4797]: I1013 15:32:44.861155 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95sw4" event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerStarted","Data":"72d54018c7108f05d0c3f8fcc4c51899be5c5a7f1ea383047a6585acd85cb71a"} Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.495526 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.653939 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-agent-neutron-config-0\") pod \"17115edf-f950-40b3-9a3b-3948815da323\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.653999 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-inventory\") pod \"17115edf-f950-40b3-9a3b-3948815da323\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.654151 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ssh-key\") pod \"17115edf-f950-40b3-9a3b-3948815da323\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.654853 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fs8g\" (UniqueName: 
\"kubernetes.io/projected/17115edf-f950-40b3-9a3b-3948815da323-kube-api-access-7fs8g\") pod \"17115edf-f950-40b3-9a3b-3948815da323\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.654932 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ceph\") pod \"17115edf-f950-40b3-9a3b-3948815da323\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.655016 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-combined-ca-bundle\") pod \"17115edf-f950-40b3-9a3b-3948815da323\" (UID: \"17115edf-f950-40b3-9a3b-3948815da323\") " Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.660990 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ceph" (OuterVolumeSpecName: "ceph") pod "17115edf-f950-40b3-9a3b-3948815da323" (UID: "17115edf-f950-40b3-9a3b-3948815da323"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.661022 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "17115edf-f950-40b3-9a3b-3948815da323" (UID: "17115edf-f950-40b3-9a3b-3948815da323"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.661647 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17115edf-f950-40b3-9a3b-3948815da323-kube-api-access-7fs8g" (OuterVolumeSpecName: "kube-api-access-7fs8g") pod "17115edf-f950-40b3-9a3b-3948815da323" (UID: "17115edf-f950-40b3-9a3b-3948815da323"). InnerVolumeSpecName "kube-api-access-7fs8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.690572 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-inventory" (OuterVolumeSpecName: "inventory") pod "17115edf-f950-40b3-9a3b-3948815da323" (UID: "17115edf-f950-40b3-9a3b-3948815da323"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.691028 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17115edf-f950-40b3-9a3b-3948815da323" (UID: "17115edf-f950-40b3-9a3b-3948815da323"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.691361 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "17115edf-f950-40b3-9a3b-3948815da323" (UID: "17115edf-f950-40b3-9a3b-3948815da323"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.758330 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fs8g\" (UniqueName: \"kubernetes.io/projected/17115edf-f950-40b3-9a3b-3948815da323-kube-api-access-7fs8g\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.758373 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.758389 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.758404 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.758421 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.758433 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17115edf-f950-40b3-9a3b-3948815da323-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.911209 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95sw4" 
event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerStarted","Data":"0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2"} Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.919458 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" event={"ID":"17115edf-f950-40b3-9a3b-3948815da323","Type":"ContainerDied","Data":"2cff8a3b08ecc43b2fb65b0620d4e5110346323036bddcbdef87171d5e5dc3da"} Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.919501 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cff8a3b08ecc43b2fb65b0620d4e5110346323036bddcbdef87171d5e5dc3da" Oct 13 15:32:46 crc kubenswrapper[4797]: I1013 15:32:46.919568 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-sfvnf" Oct 13 15:32:47 crc kubenswrapper[4797]: I1013 15:32:47.929895 4797 generic.go:334] "Generic (PLEG): container finished" podID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerID="0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2" exitCode=0 Oct 13 15:32:47 crc kubenswrapper[4797]: I1013 15:32:47.929970 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95sw4" event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerDied","Data":"0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2"} Oct 13 15:32:48 crc kubenswrapper[4797]: I1013 15:32:48.942087 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95sw4" event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerStarted","Data":"2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d"} Oct 13 15:32:48 crc kubenswrapper[4797]: I1013 15:32:48.963320 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-95sw4" 
podStartSLOduration=2.315951788 podStartE2EDuration="5.963298083s" podCreationTimestamp="2025-10-13 15:32:43 +0000 UTC" firstStartedPulling="2025-10-13 15:32:44.863340777 +0000 UTC m=+8742.396891033" lastFinishedPulling="2025-10-13 15:32:48.510687062 +0000 UTC m=+8746.044237328" observedRunningTime="2025-10-13 15:32:48.960695729 +0000 UTC m=+8746.494246005" watchObservedRunningTime="2025-10-13 15:32:48.963298083 +0000 UTC m=+8746.496848339" Oct 13 15:32:53 crc kubenswrapper[4797]: I1013 15:32:53.655822 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:53 crc kubenswrapper[4797]: I1013 15:32:53.656370 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:53 crc kubenswrapper[4797]: I1013 15:32:53.723849 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:54 crc kubenswrapper[4797]: I1013 15:32:54.062852 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:54 crc kubenswrapper[4797]: I1013 15:32:54.120987 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-95sw4"] Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.016920 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-95sw4" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="registry-server" containerID="cri-o://2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d" gracePeriod=2 Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.551736 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.669669 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tngdc\" (UniqueName: \"kubernetes.io/projected/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-kube-api-access-tngdc\") pod \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.669753 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-utilities\") pod \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.670048 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-catalog-content\") pod \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\" (UID: \"9a29aecb-d7e5-4378-872b-15fe7dcddb7a\") " Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.670995 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-utilities" (OuterVolumeSpecName: "utilities") pod "9a29aecb-d7e5-4378-872b-15fe7dcddb7a" (UID: "9a29aecb-d7e5-4378-872b-15fe7dcddb7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.676778 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-kube-api-access-tngdc" (OuterVolumeSpecName: "kube-api-access-tngdc") pod "9a29aecb-d7e5-4378-872b-15fe7dcddb7a" (UID: "9a29aecb-d7e5-4378-872b-15fe7dcddb7a"). InnerVolumeSpecName "kube-api-access-tngdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.722788 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a29aecb-d7e5-4378-872b-15fe7dcddb7a" (UID: "9a29aecb-d7e5-4378-872b-15fe7dcddb7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.773240 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.773290 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tngdc\" (UniqueName: \"kubernetes.io/projected/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-kube-api-access-tngdc\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:56 crc kubenswrapper[4797]: I1013 15:32:56.773304 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a29aecb-d7e5-4378-872b-15fe7dcddb7a-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.029723 4797 generic.go:334] "Generic (PLEG): container finished" podID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerID="2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d" exitCode=0 Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.029775 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95sw4" event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerDied","Data":"2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d"} Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.029867 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-95sw4" event={"ID":"9a29aecb-d7e5-4378-872b-15fe7dcddb7a","Type":"ContainerDied","Data":"72d54018c7108f05d0c3f8fcc4c51899be5c5a7f1ea383047a6585acd85cb71a"} Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.029889 4797 scope.go:117] "RemoveContainer" containerID="2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.030055 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95sw4" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.062551 4797 scope.go:117] "RemoveContainer" containerID="0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.081250 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-95sw4"] Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.086558 4797 scope.go:117] "RemoveContainer" containerID="8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.093371 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-95sw4"] Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.137539 4797 scope.go:117] "RemoveContainer" containerID="2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d" Oct 13 15:32:57 crc kubenswrapper[4797]: E1013 15:32:57.138330 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d\": container with ID starting with 2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d not found: ID does not exist" containerID="2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 
15:32:57.138384 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d"} err="failed to get container status \"2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d\": rpc error: code = NotFound desc = could not find container \"2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d\": container with ID starting with 2f30881c965a7b34244dcd7f8510d5dfecdf5b1228fd7664d04681fa6e66491d not found: ID does not exist" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.138415 4797 scope.go:117] "RemoveContainer" containerID="0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2" Oct 13 15:32:57 crc kubenswrapper[4797]: E1013 15:32:57.138759 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2\": container with ID starting with 0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2 not found: ID does not exist" containerID="0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.138795 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2"} err="failed to get container status \"0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2\": rpc error: code = NotFound desc = could not find container \"0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2\": container with ID starting with 0285b5f04f42bed78d74c9d3fb2b24cd430af8977cc7e2c117e38ed2dc2f35b2 not found: ID does not exist" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.138848 4797 scope.go:117] "RemoveContainer" containerID="8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1" Oct 13 15:32:57 crc 
kubenswrapper[4797]: E1013 15:32:57.139635 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1\": container with ID starting with 8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1 not found: ID does not exist" containerID="8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.139660 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1"} err="failed to get container status \"8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1\": rpc error: code = NotFound desc = could not find container \"8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1\": container with ID starting with 8ba8ba5cadfff937c3d394fca207479a4e6c174f03772e1abcb6e82b86a49ce1 not found: ID does not exist" Oct 13 15:32:57 crc kubenswrapper[4797]: I1013 15:32:57.251945 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" path="/var/lib/kubelet/pods/9a29aecb-d7e5-4378-872b-15fe7dcddb7a/volumes" Oct 13 15:32:58 crc kubenswrapper[4797]: I1013 15:32:58.235966 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:32:59 crc kubenswrapper[4797]: I1013 15:32:59.053171 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"98efd66111782db243b8b6633201b9be37d56075cae91ee51e26bf205e3692c6"} Oct 13 15:33:16 crc kubenswrapper[4797]: I1013 15:33:16.311731 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 15:33:16 crc 
kubenswrapper[4797]: I1013 15:33:16.312433 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="65493b63-8563-4071-98e1-647040b90c23" containerName="nova-cell0-conductor-conductor" containerID="cri-o://d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b" gracePeriod=30 Oct 13 15:33:16 crc kubenswrapper[4797]: I1013 15:33:16.341290 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 15:33:16 crc kubenswrapper[4797]: I1013 15:33:16.341498 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" containerName="nova-cell1-conductor-conductor" containerID="cri-o://af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" gracePeriod=30 Oct 13 15:33:16 crc kubenswrapper[4797]: E1013 15:33:16.451874 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 15:33:16 crc kubenswrapper[4797]: E1013 15:33:16.453928 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 15:33:16 crc kubenswrapper[4797]: E1013 15:33:16.455572 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 13 15:33:16 crc kubenswrapper[4797]: E1013 15:33:16.455676 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" containerName="nova-cell1-conductor-conductor" Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.316549 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.316771 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="abf87031-2d48-4271-9cbd-2c872a8e75e3" containerName="nova-scheduler-scheduler" containerID="cri-o://1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35" gracePeriod=30 Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.335661 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.335953 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-log" containerID="cri-o://95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4" gracePeriod=30 Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.336038 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-api" containerID="cri-o://c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24" gracePeriod=30 Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.367271 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.367776 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-log" containerID="cri-o://fb66a988bd5f490396133e2807eb34da2f13ff1da09ea7d4b86bf67ed042750a" gracePeriod=30 Oct 13 15:33:17 crc kubenswrapper[4797]: I1013 15:33:17.367856 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-metadata" containerID="cri-o://6c060dc934ae135cc4334d93281f9963ff3fb142cdeb222dbb378c1dc96b7d1c" gracePeriod=30 Oct 13 15:33:17 crc kubenswrapper[4797]: E1013 15:33:17.648210 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 15:33:17 crc kubenswrapper[4797]: E1013 15:33:17.652544 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 15:33:17 crc kubenswrapper[4797]: E1013 15:33:17.654462 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 15:33:17 crc kubenswrapper[4797]: E1013 15:33:17.654515 4797 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="abf87031-2d48-4271-9cbd-2c872a8e75e3" containerName="nova-scheduler-scheduler" Oct 13 15:33:18 crc kubenswrapper[4797]: I1013 15:33:18.247367 4797 generic.go:334] "Generic (PLEG): container finished" podID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerID="95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4" exitCode=143 Oct 13 15:33:18 crc kubenswrapper[4797]: I1013 15:33:18.247430 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3386188-e3d7-49b6-b099-7a5b66e012ee","Type":"ContainerDied","Data":"95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4"} Oct 13 15:33:18 crc kubenswrapper[4797]: I1013 15:33:18.249731 4797 generic.go:334] "Generic (PLEG): container finished" podID="10b527c4-151c-464b-a112-64bd7b5a7444" containerID="fb66a988bd5f490396133e2807eb34da2f13ff1da09ea7d4b86bf67ed042750a" exitCode=143 Oct 13 15:33:18 crc kubenswrapper[4797]: I1013 15:33:18.249792 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10b527c4-151c-464b-a112-64bd7b5a7444","Type":"ContainerDied","Data":"fb66a988bd5f490396133e2807eb34da2f13ff1da09ea7d4b86bf67ed042750a"} Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.262030 4797 generic.go:334] "Generic (PLEG): container finished" podID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" exitCode=0 Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.262295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7cc2cb14-0b77-4361-b3e4-c1196a2253ce","Type":"ContainerDied","Data":"af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851"} Oct 13 15:33:19 
crc kubenswrapper[4797]: I1013 15:33:19.459684 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.564071 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-combined-ca-bundle\") pod \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.564147 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-config-data\") pod \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.564371 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxm5f\" (UniqueName: \"kubernetes.io/projected/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-kube-api-access-vxm5f\") pod \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\" (UID: \"7cc2cb14-0b77-4361-b3e4-c1196a2253ce\") " Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.583481 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-kube-api-access-vxm5f" (OuterVolumeSpecName: "kube-api-access-vxm5f") pod "7cc2cb14-0b77-4361-b3e4-c1196a2253ce" (UID: "7cc2cb14-0b77-4361-b3e4-c1196a2253ce"). InnerVolumeSpecName "kube-api-access-vxm5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.593482 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cc2cb14-0b77-4361-b3e4-c1196a2253ce" (UID: "7cc2cb14-0b77-4361-b3e4-c1196a2253ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.595424 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-config-data" (OuterVolumeSpecName: "config-data") pod "7cc2cb14-0b77-4361-b3e4-c1196a2253ce" (UID: "7cc2cb14-0b77-4361-b3e4-c1196a2253ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.667293 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.667329 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.667339 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxm5f\" (UniqueName: \"kubernetes.io/projected/7cc2cb14-0b77-4361-b3e4-c1196a2253ce-kube-api-access-vxm5f\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.674251 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.768946 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-combined-ca-bundle\") pod \"65493b63-8563-4071-98e1-647040b90c23\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.768992 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t7tp\" (UniqueName: \"kubernetes.io/projected/65493b63-8563-4071-98e1-647040b90c23-kube-api-access-5t7tp\") pod \"65493b63-8563-4071-98e1-647040b90c23\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.769247 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-config-data\") pod \"65493b63-8563-4071-98e1-647040b90c23\" (UID: \"65493b63-8563-4071-98e1-647040b90c23\") " Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.772578 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65493b63-8563-4071-98e1-647040b90c23-kube-api-access-5t7tp" (OuterVolumeSpecName: "kube-api-access-5t7tp") pod "65493b63-8563-4071-98e1-647040b90c23" (UID: "65493b63-8563-4071-98e1-647040b90c23"). InnerVolumeSpecName "kube-api-access-5t7tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.793098 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-config-data" (OuterVolumeSpecName: "config-data") pod "65493b63-8563-4071-98e1-647040b90c23" (UID: "65493b63-8563-4071-98e1-647040b90c23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.796307 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65493b63-8563-4071-98e1-647040b90c23" (UID: "65493b63-8563-4071-98e1-647040b90c23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.871224 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.871255 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65493b63-8563-4071-98e1-647040b90c23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:19 crc kubenswrapper[4797]: I1013 15:33:19.871265 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t7tp\" (UniqueName: \"kubernetes.io/projected/65493b63-8563-4071-98e1-647040b90c23-kube-api-access-5t7tp\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.276416 4797 generic.go:334] "Generic (PLEG): container finished" podID="65493b63-8563-4071-98e1-647040b90c23" containerID="d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b" exitCode=0 Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.276525 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.276548 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65493b63-8563-4071-98e1-647040b90c23","Type":"ContainerDied","Data":"d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b"} Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.276655 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65493b63-8563-4071-98e1-647040b90c23","Type":"ContainerDied","Data":"f43c9ed899e198c85092d2ecb0f8d5956c76a0cd3db3b5401721e8a67d2a5174"} Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.276684 4797 scope.go:117] "RemoveContainer" containerID="d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.280282 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7cc2cb14-0b77-4361-b3e4-c1196a2253ce","Type":"ContainerDied","Data":"18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7"} Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.280388 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.321583 4797 scope.go:117] "RemoveContainer" containerID="d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b" Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.322353 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b\": container with ID starting with d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b not found: ID does not exist" containerID="d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.322399 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b"} err="failed to get container status \"d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b\": rpc error: code = NotFound desc = could not find container \"d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b\": container with ID starting with d98e214ebbcf98374ce57a8265cecb7f6fd8415b1210b0470ba22c31ad251f9b not found: ID does not exist" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.322425 4797 scope.go:117] "RemoveContainer" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.325689 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.339486 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.344888 4797 scope.go:117] "RemoveContainer" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" Oct 13 15:33:20 crc 
kubenswrapper[4797]: E1013 15:33:20.355221 4797 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_nova-cell1-conductor-conductor_nova-cell1-conductor-0_openstack_7cc2cb14-0b77-4361-b3e4-c1196a2253ce_0 in pod sandbox 18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7 from index: no such id: 'af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851'" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.355283 4797 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_nova-cell1-conductor-conductor_nova-cell1-conductor-0_openstack_7cc2cb14-0b77-4361-b3e4-c1196a2253ce_0 in pod sandbox 18981afe8cb2860b08a2b3f06066237f41227dc99b962cbb4580e90325ad5bc7 from index: no such id: 'af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851'" containerID="af1984024f7a3d917a811ad718a3683fe1090a17543cab3ab5fec01ccdb4f851" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.356597 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.377575 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.398347 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.398945 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" containerName="nova-cell1-conductor-conductor" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.398964 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" containerName="nova-cell1-conductor-conductor" Oct 13 15:33:20 crc 
kubenswrapper[4797]: E1013 15:33:20.398981 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17115edf-f950-40b3-9a3b-3948815da323" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.398987 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="17115edf-f950-40b3-9a3b-3948815da323" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.399025 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="extract-content" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399033 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="extract-content" Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.399045 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="registry-server" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399051 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="registry-server" Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.399061 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65493b63-8563-4071-98e1-647040b90c23" containerName="nova-cell0-conductor-conductor" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399068 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="65493b63-8563-4071-98e1-647040b90c23" containerName="nova-cell0-conductor-conductor" Oct 13 15:33:20 crc kubenswrapper[4797]: E1013 15:33:20.399076 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="extract-utilities" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399083 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="extract-utilities" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399259 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="65493b63-8563-4071-98e1-647040b90c23" containerName="nova-cell0-conductor-conductor" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399272 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a29aecb-d7e5-4378-872b-15fe7dcddb7a" containerName="registry-server" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399286 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="17115edf-f950-40b3-9a3b-3948815da323" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.399299 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" containerName="nova-cell1-conductor-conductor" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.400453 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.402948 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.412267 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.416709 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.419851 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.429727 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.444457 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.484092 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgs5\" (UniqueName: \"kubernetes.io/projected/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-kube-api-access-cjgs5\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.484130 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.484185 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7295c003-d21f-4137-96b6-0ae19de3d1be-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.484259 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.484297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2bbz\" (UniqueName: \"kubernetes.io/projected/7295c003-d21f-4137-96b6-0ae19de3d1be-kube-api-access-x2bbz\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.484352 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c003-d21f-4137-96b6-0ae19de3d1be-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.587453 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.587538 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2bbz\" (UniqueName: \"kubernetes.io/projected/7295c003-d21f-4137-96b6-0ae19de3d1be-kube-api-access-x2bbz\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.587619 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7295c003-d21f-4137-96b6-0ae19de3d1be-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.587693 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgs5\" (UniqueName: \"kubernetes.io/projected/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-kube-api-access-cjgs5\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.587719 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.587793 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7295c003-d21f-4137-96b6-0ae19de3d1be-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.599605 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7295c003-d21f-4137-96b6-0ae19de3d1be-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.600431 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.605579 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.608703 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c003-d21f-4137-96b6-0ae19de3d1be-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.616353 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2bbz\" (UniqueName: \"kubernetes.io/projected/7295c003-d21f-4137-96b6-0ae19de3d1be-kube-api-access-x2bbz\") pod \"nova-cell1-conductor-0\" (UID: \"7295c003-d21f-4137-96b6-0ae19de3d1be\") " pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.620578 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgs5\" (UniqueName: \"kubernetes.io/projected/cb9b1886-01a9-49a3-a525-e2cebb3c8c85-kube-api-access-cjgs5\") pod \"nova-cell0-conductor-0\" (UID: \"cb9b1886-01a9-49a3-a525-e2cebb3c8c85\") " pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.723293 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:20 crc kubenswrapper[4797]: I1013 15:33:20.742379 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.111502 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.203923 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3386188-e3d7-49b6-b099-7a5b66e012ee-logs\") pod \"b3386188-e3d7-49b6-b099-7a5b66e012ee\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.204084 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-combined-ca-bundle\") pod \"b3386188-e3d7-49b6-b099-7a5b66e012ee\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.204116 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-config-data\") pod \"b3386188-e3d7-49b6-b099-7a5b66e012ee\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.204165 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crdv4\" (UniqueName: \"kubernetes.io/projected/b3386188-e3d7-49b6-b099-7a5b66e012ee-kube-api-access-crdv4\") pod \"b3386188-e3d7-49b6-b099-7a5b66e012ee\" (UID: \"b3386188-e3d7-49b6-b099-7a5b66e012ee\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.224146 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3386188-e3d7-49b6-b099-7a5b66e012ee-logs" (OuterVolumeSpecName: "logs") pod "b3386188-e3d7-49b6-b099-7a5b66e012ee" (UID: "b3386188-e3d7-49b6-b099-7a5b66e012ee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.233621 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3386188-e3d7-49b6-b099-7a5b66e012ee-kube-api-access-crdv4" (OuterVolumeSpecName: "kube-api-access-crdv4") pod "b3386188-e3d7-49b6-b099-7a5b66e012ee" (UID: "b3386188-e3d7-49b6-b099-7a5b66e012ee"). InnerVolumeSpecName "kube-api-access-crdv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.280212 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65493b63-8563-4071-98e1-647040b90c23" path="/var/lib/kubelet/pods/65493b63-8563-4071-98e1-647040b90c23/volumes" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.287074 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-config-data" (OuterVolumeSpecName: "config-data") pod "b3386188-e3d7-49b6-b099-7a5b66e012ee" (UID: "b3386188-e3d7-49b6-b099-7a5b66e012ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.294459 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc2cb14-0b77-4361-b3e4-c1196a2253ce" path="/var/lib/kubelet/pods/7cc2cb14-0b77-4361-b3e4-c1196a2253ce/volumes" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.306158 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3386188-e3d7-49b6-b099-7a5b66e012ee-logs\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.306189 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.306201 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crdv4\" (UniqueName: \"kubernetes.io/projected/b3386188-e3d7-49b6-b099-7a5b66e012ee-kube-api-access-crdv4\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.309935 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3386188-e3d7-49b6-b099-7a5b66e012ee" (UID: "b3386188-e3d7-49b6-b099-7a5b66e012ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.313176 4797 generic.go:334] "Generic (PLEG): container finished" podID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerID="c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24" exitCode=0 Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.313326 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.314997 4797 generic.go:334] "Generic (PLEG): container finished" podID="10b527c4-151c-464b-a112-64bd7b5a7444" containerID="6c060dc934ae135cc4334d93281f9963ff3fb142cdeb222dbb378c1dc96b7d1c" exitCode=0 Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.382212 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3386188-e3d7-49b6-b099-7a5b66e012ee","Type":"ContainerDied","Data":"c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24"} Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.382253 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3386188-e3d7-49b6-b099-7a5b66e012ee","Type":"ContainerDied","Data":"305f114b16360690138d4618d0a0ffa4b05d0d3ed325872577336daef1b999c8"} Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.382269 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10b527c4-151c-464b-a112-64bd7b5a7444","Type":"ContainerDied","Data":"6c060dc934ae135cc4334d93281f9963ff3fb142cdeb222dbb378c1dc96b7d1c"} Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.382294 4797 scope.go:117] "RemoveContainer" containerID="c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.413389 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3386188-e3d7-49b6-b099-7a5b66e012ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.422974 4797 scope.go:117] "RemoveContainer" containerID="95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.432303 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 15:33:21 crc 
kubenswrapper[4797]: I1013 15:33:21.433279 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.454061 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.482628 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 15:33:21 crc kubenswrapper[4797]: E1013 15:33:21.483123 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-log" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483162 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-log" Oct 13 15:33:21 crc kubenswrapper[4797]: E1013 15:33:21.483185 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-metadata" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483194 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-metadata" Oct 13 15:33:21 crc kubenswrapper[4797]: E1013 15:33:21.483206 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-api" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483214 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-api" Oct 13 15:33:21 crc kubenswrapper[4797]: E1013 15:33:21.483240 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-log" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483248 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-log" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483514 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-log" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483542 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-log" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483554 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" containerName="nova-api-api" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.483573 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" containerName="nova-metadata-metadata" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.486277 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.488593 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.497176 4797 scope.go:117] "RemoveContainer" containerID="c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24" Oct 13 15:33:21 crc kubenswrapper[4797]: E1013 15:33:21.498030 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24\": container with ID starting with c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24 not found: ID does not exist" containerID="c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.498070 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24"} err="failed to get container status \"c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24\": rpc error: code = NotFound desc = could not find container \"c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24\": container with ID starting with c9f3097f8f28be100e01af3e0dfca4331ebc0d5c4809ddeb98bd981107123b24 not found: ID does not exist" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.498101 4797 scope.go:117] "RemoveContainer" containerID="95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4" Oct 13 15:33:21 crc kubenswrapper[4797]: E1013 15:33:21.498358 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4\": container with ID starting with 95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4 not found: 
ID does not exist" containerID="95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.498379 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4"} err="failed to get container status \"95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4\": rpc error: code = NotFound desc = could not find container \"95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4\": container with ID starting with 95048d81c2b650b38cdff860feea929ab3fe2c9e48055adba8f91a38d9a111f4 not found: ID does not exist" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.501610 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.516417 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4ds\" (UniqueName: \"kubernetes.io/projected/10b527c4-151c-464b-a112-64bd7b5a7444-kube-api-access-zg4ds\") pod \"10b527c4-151c-464b-a112-64bd7b5a7444\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.516601 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-combined-ca-bundle\") pod \"10b527c4-151c-464b-a112-64bd7b5a7444\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.517200 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-config-data\") pod \"10b527c4-151c-464b-a112-64bd7b5a7444\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.517532 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b527c4-151c-464b-a112-64bd7b5a7444-logs\") pod \"10b527c4-151c-464b-a112-64bd7b5a7444\" (UID: \"10b527c4-151c-464b-a112-64bd7b5a7444\") " Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.518948 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b527c4-151c-464b-a112-64bd7b5a7444-logs" (OuterVolumeSpecName: "logs") pod "10b527c4-151c-464b-a112-64bd7b5a7444" (UID: "10b527c4-151c-464b-a112-64bd7b5a7444"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.579633 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.619892 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6798d\" (UniqueName: \"kubernetes.io/projected/0762bcb1-f8cd-4a9d-8691-1f6e32602199-kube-api-access-6798d\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.619947 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0762bcb1-f8cd-4a9d-8691-1f6e32602199-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.620095 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0762bcb1-f8cd-4a9d-8691-1f6e32602199-logs\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 
15:33:21.620136 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0762bcb1-f8cd-4a9d-8691-1f6e32602199-config-data\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.620272 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10b527c4-151c-464b-a112-64bd7b5a7444-logs\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.705794 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.722312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6798d\" (UniqueName: \"kubernetes.io/projected/0762bcb1-f8cd-4a9d-8691-1f6e32602199-kube-api-access-6798d\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.722354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0762bcb1-f8cd-4a9d-8691-1f6e32602199-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.722432 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0762bcb1-f8cd-4a9d-8691-1f6e32602199-logs\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.722463 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0762bcb1-f8cd-4a9d-8691-1f6e32602199-config-data\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.722926 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0762bcb1-f8cd-4a9d-8691-1f6e32602199-logs\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.884475 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0762bcb1-f8cd-4a9d-8691-1f6e32602199-config-data\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.884990 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6798d\" (UniqueName: \"kubernetes.io/projected/0762bcb1-f8cd-4a9d-8691-1f6e32602199-kube-api-access-6798d\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.886132 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b527c4-151c-464b-a112-64bd7b5a7444-kube-api-access-zg4ds" (OuterVolumeSpecName: "kube-api-access-zg4ds") pod "10b527c4-151c-464b-a112-64bd7b5a7444" (UID: "10b527c4-151c-464b-a112-64bd7b5a7444"). InnerVolumeSpecName "kube-api-access-zg4ds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.886446 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0762bcb1-f8cd-4a9d-8691-1f6e32602199-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0762bcb1-f8cd-4a9d-8691-1f6e32602199\") " pod="openstack/nova-api-0" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.926683 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4ds\" (UniqueName: \"kubernetes.io/projected/10b527c4-151c-464b-a112-64bd7b5a7444-kube-api-access-zg4ds\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.936585 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-config-data" (OuterVolumeSpecName: "config-data") pod "10b527c4-151c-464b-a112-64bd7b5a7444" (UID: "10b527c4-151c-464b-a112-64bd7b5a7444"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:21 crc kubenswrapper[4797]: I1013 15:33:21.963596 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10b527c4-151c-464b-a112-64bd7b5a7444" (UID: "10b527c4-151c-464b-a112-64bd7b5a7444"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.028849 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.028886 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b527c4-151c-464b-a112-64bd7b5a7444-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.110844 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.329188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7295c003-d21f-4137-96b6-0ae19de3d1be","Type":"ContainerStarted","Data":"9b30ced8cecc797bf441934c0573ae635cabe528b7e9e344c266854748278ee8"} Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.331362 4797 generic.go:334] "Generic (PLEG): container finished" podID="abf87031-2d48-4271-9cbd-2c872a8e75e3" containerID="1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35" exitCode=0 Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.331429 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf87031-2d48-4271-9cbd-2c872a8e75e3","Type":"ContainerDied","Data":"1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35"} Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.333536 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb9b1886-01a9-49a3-a525-e2cebb3c8c85","Type":"ContainerStarted","Data":"aba430fd40071c9918ced30242511230c0896efa89eec803e716eaa32afc0783"} Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.333569 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb9b1886-01a9-49a3-a525-e2cebb3c8c85","Type":"ContainerStarted","Data":"7795059ccd068e69167b5051f585d87eaea2bcbb6e509794e2fa1a1207ae1d65"} Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.333596 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.344153 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10b527c4-151c-464b-a112-64bd7b5a7444","Type":"ContainerDied","Data":"7eb9ba15b76879003e5d86a4e9a6fdfa1c25fed169eeac9e873ada71422433c1"} Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.344204 4797 scope.go:117] "RemoveContainer" containerID="6c060dc934ae135cc4334d93281f9963ff3fb142cdeb222dbb378c1dc96b7d1c" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.344220 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.351258 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.351241183 podStartE2EDuration="2.351241183s" podCreationTimestamp="2025-10-13 15:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 15:33:22.348585208 +0000 UTC m=+8779.882135464" watchObservedRunningTime="2025-10-13 15:33:22.351241183 +0000 UTC m=+8779.884791439" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.436631 4797 scope.go:117] "RemoveContainer" containerID="fb66a988bd5f490396133e2807eb34da2f13ff1da09ea7d4b86bf67ed042750a" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.456205 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.470871 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.486612 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.489018 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.491382 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.500260 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.524019 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.609223 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 15:33:22 crc kubenswrapper[4797]: W1013 15:33:22.610567 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0762bcb1_f8cd_4a9d_8691_1f6e32602199.slice/crio-86a94d229bc572ce33f96f059e18faa20ac9eedacdab4c048ef0dec9ed86b349 WatchSource:0}: Error finding container 86a94d229bc572ce33f96f059e18faa20ac9eedacdab4c048ef0dec9ed86b349: Status 404 returned error can't find the container with id 86a94d229bc572ce33f96f059e18faa20ac9eedacdab4c048ef0dec9ed86b349 Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.644843 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z82jl\" (UniqueName: \"kubernetes.io/projected/abf87031-2d48-4271-9cbd-2c872a8e75e3-kube-api-access-z82jl\") pod \"abf87031-2d48-4271-9cbd-2c872a8e75e3\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.646222 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-config-data\") pod \"abf87031-2d48-4271-9cbd-2c872a8e75e3\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.646402 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-combined-ca-bundle\") pod \"abf87031-2d48-4271-9cbd-2c872a8e75e3\" (UID: \"abf87031-2d48-4271-9cbd-2c872a8e75e3\") " Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.646943 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l89m9\" (UniqueName: \"kubernetes.io/projected/c1c896d3-bdc3-4adb-b712-43be751a5fd8-kube-api-access-l89m9\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.647288 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c896d3-bdc3-4adb-b712-43be751a5fd8-logs\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.647396 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c896d3-bdc3-4adb-b712-43be751a5fd8-config-data\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.647562 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c896d3-bdc3-4adb-b712-43be751a5fd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.651469 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf87031-2d48-4271-9cbd-2c872a8e75e3-kube-api-access-z82jl" (OuterVolumeSpecName: "kube-api-access-z82jl") pod "abf87031-2d48-4271-9cbd-2c872a8e75e3" (UID: "abf87031-2d48-4271-9cbd-2c872a8e75e3"). InnerVolumeSpecName "kube-api-access-z82jl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.673942 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-config-data" (OuterVolumeSpecName: "config-data") pod "abf87031-2d48-4271-9cbd-2c872a8e75e3" (UID: "abf87031-2d48-4271-9cbd-2c872a8e75e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.687669 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf87031-2d48-4271-9cbd-2c872a8e75e3" (UID: "abf87031-2d48-4271-9cbd-2c872a8e75e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749553 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l89m9\" (UniqueName: \"kubernetes.io/projected/c1c896d3-bdc3-4adb-b712-43be751a5fd8-kube-api-access-l89m9\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749656 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c896d3-bdc3-4adb-b712-43be751a5fd8-logs\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749694 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c896d3-bdc3-4adb-b712-43be751a5fd8-config-data\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " 
pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749758 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c896d3-bdc3-4adb-b712-43be751a5fd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749938 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z82jl\" (UniqueName: \"kubernetes.io/projected/abf87031-2d48-4271-9cbd-2c872a8e75e3-kube-api-access-z82jl\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749955 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.749967 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf87031-2d48-4271-9cbd-2c872a8e75e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.750140 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c896d3-bdc3-4adb-b712-43be751a5fd8-logs\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.754111 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c896d3-bdc3-4adb-b712-43be751a5fd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.755710 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c896d3-bdc3-4adb-b712-43be751a5fd8-config-data\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.770604 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l89m9\" (UniqueName: \"kubernetes.io/projected/c1c896d3-bdc3-4adb-b712-43be751a5fd8-kube-api-access-l89m9\") pod \"nova-metadata-0\" (UID: \"c1c896d3-bdc3-4adb-b712-43be751a5fd8\") " pod="openstack/nova-metadata-0" Oct 13 15:33:22 crc kubenswrapper[4797]: I1013 15:33:22.842529 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.257319 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b527c4-151c-464b-a112-64bd7b5a7444" path="/var/lib/kubelet/pods/10b527c4-151c-464b-a112-64bd7b5a7444/volumes" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.258171 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3386188-e3d7-49b6-b099-7a5b66e012ee" path="/var/lib/kubelet/pods/b3386188-e3d7-49b6-b099-7a5b66e012ee/volumes" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.320276 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.353936 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0762bcb1-f8cd-4a9d-8691-1f6e32602199","Type":"ContainerStarted","Data":"09d3a7873ff1455c0b2bd88826fe0a91a1d66c957554787b27bb53c72088fd8e"} Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.353985 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0762bcb1-f8cd-4a9d-8691-1f6e32602199","Type":"ContainerStarted","Data":"85f7d0f1b6c8a2442ab61c317dd4a8b3c1118e12ba053de9e491e120a088772c"} Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.354000 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0762bcb1-f8cd-4a9d-8691-1f6e32602199","Type":"ContainerStarted","Data":"86a94d229bc572ce33f96f059e18faa20ac9eedacdab4c048ef0dec9ed86b349"} Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.362637 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7295c003-d21f-4137-96b6-0ae19de3d1be","Type":"ContainerStarted","Data":"e7c14a622262d92912233269c442992dd4453d612facfd80eff050d614988f60"} Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.363566 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.366558 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.366936 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf87031-2d48-4271-9cbd-2c872a8e75e3","Type":"ContainerDied","Data":"ec6e85227c57b9640496984943f3c136f93f62a974b37c3fccedd06ec7b475c6"} Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.366968 4797 scope.go:117] "RemoveContainer" containerID="1267e88d787b2a87fe9ff47a429f14e3f8526961643107ddae60fcdcb361fa35" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.386847 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.386822359 podStartE2EDuration="2.386822359s" podCreationTimestamp="2025-10-13 15:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 15:33:23.377270363 +0000 UTC m=+8780.910820629" watchObservedRunningTime="2025-10-13 15:33:23.386822359 +0000 UTC m=+8780.920372615" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.400079 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.400054905 podStartE2EDuration="3.400054905s" podCreationTimestamp="2025-10-13 15:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 15:33:23.395322128 +0000 UTC m=+8780.928872414" watchObservedRunningTime="2025-10-13 15:33:23.400054905 +0000 UTC m=+8780.933605181" Oct 13 15:33:23 crc kubenswrapper[4797]: W1013 15:33:23.686732 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c896d3_bdc3_4adb_b712_43be751a5fd8.slice/crio-e804eb3885cf9304612d7cc24d6bd63634f30d1252389abd727f1c02b0085c02 WatchSource:0}: Error finding 
container e804eb3885cf9304612d7cc24d6bd63634f30d1252389abd727f1c02b0085c02: Status 404 returned error can't find the container with id e804eb3885cf9304612d7cc24d6bd63634f30d1252389abd727f1c02b0085c02 Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.844413 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.853436 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.868198 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 15:33:23 crc kubenswrapper[4797]: E1013 15:33:23.868700 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf87031-2d48-4271-9cbd-2c872a8e75e3" containerName="nova-scheduler-scheduler" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.868723 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf87031-2d48-4271-9cbd-2c872a8e75e3" containerName="nova-scheduler-scheduler" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.868992 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf87031-2d48-4271-9cbd-2c872a8e75e3" containerName="nova-scheduler-scheduler" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.869882 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.873867 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.897630 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.979698 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eea067-06e3-4bc2-ad55-7e7157dbbb99-config-data\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.980236 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eea067-06e3-4bc2-ad55-7e7157dbbb99-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:23 crc kubenswrapper[4797]: I1013 15:33:23.980564 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hkv\" (UniqueName: \"kubernetes.io/projected/43eea067-06e3-4bc2-ad55-7e7157dbbb99-kube-api-access-49hkv\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.082756 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eea067-06e3-4bc2-ad55-7e7157dbbb99-config-data\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.082905 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eea067-06e3-4bc2-ad55-7e7157dbbb99-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.083465 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hkv\" (UniqueName: \"kubernetes.io/projected/43eea067-06e3-4bc2-ad55-7e7157dbbb99-kube-api-access-49hkv\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.086620 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eea067-06e3-4bc2-ad55-7e7157dbbb99-config-data\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.086664 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eea067-06e3-4bc2-ad55-7e7157dbbb99-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.104152 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hkv\" (UniqueName: \"kubernetes.io/projected/43eea067-06e3-4bc2-ad55-7e7157dbbb99-kube-api-access-49hkv\") pod \"nova-scheduler-0\" (UID: \"43eea067-06e3-4bc2-ad55-7e7157dbbb99\") " pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.207569 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.406518 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c896d3-bdc3-4adb-b712-43be751a5fd8","Type":"ContainerStarted","Data":"eeb6342dd7cec21a514545e8d5649831891b6a59e15f8e0a1226db33ec270136"} Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.406864 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c896d3-bdc3-4adb-b712-43be751a5fd8","Type":"ContainerStarted","Data":"414273e0e661a14e11c423d7fd1ec0eee1ab417279a75d82cba6cf62410fcd7f"} Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.406880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c896d3-bdc3-4adb-b712-43be751a5fd8","Type":"ContainerStarted","Data":"e804eb3885cf9304612d7cc24d6bd63634f30d1252389abd727f1c02b0085c02"} Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.445153 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4451299730000002 podStartE2EDuration="2.445129973s" podCreationTimestamp="2025-10-13 15:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 15:33:24.42472389 +0000 UTC m=+8781.958274166" watchObservedRunningTime="2025-10-13 15:33:24.445129973 +0000 UTC m=+8781.978680229" Oct 13 15:33:24 crc kubenswrapper[4797]: I1013 15:33:24.731043 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 15:33:25 crc kubenswrapper[4797]: I1013 15:33:25.252713 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf87031-2d48-4271-9cbd-2c872a8e75e3" path="/var/lib/kubelet/pods/abf87031-2d48-4271-9cbd-2c872a8e75e3/volumes" Oct 13 15:33:25 crc kubenswrapper[4797]: I1013 15:33:25.420627 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43eea067-06e3-4bc2-ad55-7e7157dbbb99","Type":"ContainerStarted","Data":"40d654b6605e540a90dedd347e8ed080013eee6718605457a578cd67abb4963d"} Oct 13 15:33:25 crc kubenswrapper[4797]: I1013 15:33:25.420680 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43eea067-06e3-4bc2-ad55-7e7157dbbb99","Type":"ContainerStarted","Data":"b9f456ac13229cb3bf1be41aa2a7d7ccd80c339c88c0ec99923129a71f76ee4f"} Oct 13 15:33:25 crc kubenswrapper[4797]: I1013 15:33:25.442488 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.442466106 podStartE2EDuration="2.442466106s" podCreationTimestamp="2025-10-13 15:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 15:33:25.439476053 +0000 UTC m=+8782.973026339" watchObservedRunningTime="2025-10-13 15:33:25.442466106 +0000 UTC m=+8782.976016372" Oct 13 15:33:27 crc kubenswrapper[4797]: I1013 15:33:27.843576 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 15:33:27 crc kubenswrapper[4797]: I1013 15:33:27.843915 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 15:33:29 crc kubenswrapper[4797]: I1013 15:33:29.208564 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 15:33:30 crc kubenswrapper[4797]: I1013 15:33:30.755276 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 13 15:33:30 crc kubenswrapper[4797]: I1013 15:33:30.772763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 13 15:33:32 crc kubenswrapper[4797]: I1013 
15:33:32.111428 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 15:33:32 crc kubenswrapper[4797]: I1013 15:33:32.111833 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 15:33:32 crc kubenswrapper[4797]: I1013 15:33:32.844495 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 15:33:32 crc kubenswrapper[4797]: I1013 15:33:32.844554 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 15:33:33 crc kubenswrapper[4797]: I1013 15:33:33.194058 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0762bcb1-f8cd-4a9d-8691-1f6e32602199" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 15:33:33 crc kubenswrapper[4797]: I1013 15:33:33.194076 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0762bcb1-f8cd-4a9d-8691-1f6e32602199" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 15:33:33 crc kubenswrapper[4797]: I1013 15:33:33.926143 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c1c896d3-bdc3-4adb-b712-43be751a5fd8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.181:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 15:33:33 crc kubenswrapper[4797]: I1013 15:33:33.926189 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c1c896d3-bdc3-4adb-b712-43be751a5fd8" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"http://10.217.1.181:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 15:33:34 crc kubenswrapper[4797]: I1013 15:33:34.208374 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 15:33:34 crc kubenswrapper[4797]: I1013 15:33:34.248974 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 15:33:34 crc kubenswrapper[4797]: I1013 15:33:34.548414 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.118973 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.121037 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.123043 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.126577 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.594358 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.597994 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.846794 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.847353 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 15:33:42 crc kubenswrapper[4797]: I1013 15:33:42.848642 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 15:33:43 crc kubenswrapper[4797]: I1013 15:33:43.619766 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.559587 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x"] Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.561241 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.564045 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.564839 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.564869 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.564874 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.564984 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.565169 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rf85n" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.565750 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.579487 4797 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x"] Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736253 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdxm\" (UniqueName: \"kubernetes.io/projected/2e8a47d5-adb1-4909-9731-680948fa0320-kube-api-access-xwdxm\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736336 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736389 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736490 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: 
\"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736511 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736552 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736592 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736618 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.736658 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.737176 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.737238 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838565 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838601 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838643 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838675 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838694 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838725 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838752 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838780 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838868 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdxm\" (UniqueName: \"kubernetes.io/projected/2e8a47d5-adb1-4909-9731-680948fa0320-kube-api-access-xwdxm\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838893 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" 
(UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.838916 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.841662 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:44 crc kubenswrapper[4797]: I1013 15:33:44.842332 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.084351 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.084374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.085038 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.085096 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.086531 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: 
\"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.089532 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.090135 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.099990 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdxm\" (UniqueName: \"kubernetes.io/projected/2e8a47d5-adb1-4909-9731-680948fa0320-kube-api-access-xwdxm\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.102717 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc 
kubenswrapper[4797]: I1013 15:33:45.183529 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.817666 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x"] Oct 13 15:33:45 crc kubenswrapper[4797]: W1013 15:33:45.835833 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e8a47d5_adb1_4909_9731_680948fa0320.slice/crio-b72389b82f5622223d308d5e737af51a9e904622e089e612d10339231ac66e86 WatchSource:0}: Error finding container b72389b82f5622223d308d5e737af51a9e904622e089e612d10339231ac66e86: Status 404 returned error can't find the container with id b72389b82f5622223d308d5e737af51a9e904622e089e612d10339231ac66e86 Oct 13 15:33:45 crc kubenswrapper[4797]: I1013 15:33:45.838922 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:33:46 crc kubenswrapper[4797]: I1013 15:33:46.644218 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" event={"ID":"2e8a47d5-adb1-4909-9731-680948fa0320","Type":"ContainerStarted","Data":"c7382b4073910c39e5b5b03e1b34ceed5b9902bc31b2891d6039414f32635839"} Oct 13 15:33:46 crc kubenswrapper[4797]: I1013 15:33:46.644784 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" event={"ID":"2e8a47d5-adb1-4909-9731-680948fa0320","Type":"ContainerStarted","Data":"b72389b82f5622223d308d5e737af51a9e904622e089e612d10339231ac66e86"} Oct 13 15:33:46 crc kubenswrapper[4797]: I1013 15:33:46.662501 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" podStartSLOduration=2.15975373 podStartE2EDuration="2.662483195s" podCreationTimestamp="2025-10-13 15:33:44 +0000 UTC" firstStartedPulling="2025-10-13 15:33:45.838633628 +0000 UTC m=+8803.372183884" lastFinishedPulling="2025-10-13 15:33:46.341363103 +0000 UTC m=+8803.874913349" observedRunningTime="2025-10-13 15:33:46.662254839 +0000 UTC m=+8804.195805115" watchObservedRunningTime="2025-10-13 15:33:46.662483195 +0000 UTC m=+8804.196033461" Oct 13 15:33:53 crc kubenswrapper[4797]: I1013 15:33:53.800338 4797 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podabf87031-2d48-4271-9cbd-2c872a8e75e3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podabf87031-2d48-4271-9cbd-2c872a8e75e3] : Timed out while waiting for systemd to remove kubepods-besteffort-podabf87031_2d48_4271_9cbd_2c872a8e75e3.slice" Oct 13 15:35:18 crc kubenswrapper[4797]: I1013 15:35:18.119898 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:35:18 crc kubenswrapper[4797]: I1013 15:35:18.120520 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:35:48 crc kubenswrapper[4797]: I1013 15:35:48.119938 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:35:48 crc kubenswrapper[4797]: I1013 15:35:48.120409 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:36:18 crc kubenswrapper[4797]: I1013 15:36:18.120328 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:36:18 crc kubenswrapper[4797]: I1013 15:36:18.120907 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:36:18 crc kubenswrapper[4797]: I1013 15:36:18.120970 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:36:18 crc kubenswrapper[4797]: I1013 15:36:18.121578 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98efd66111782db243b8b6633201b9be37d56075cae91ee51e26bf205e3692c6"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:36:18 crc kubenswrapper[4797]: I1013 15:36:18.121648 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://98efd66111782db243b8b6633201b9be37d56075cae91ee51e26bf205e3692c6" gracePeriod=600 Oct 13 15:36:19 crc kubenswrapper[4797]: I1013 15:36:19.156106 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="98efd66111782db243b8b6633201b9be37d56075cae91ee51e26bf205e3692c6" exitCode=0 Oct 13 15:36:19 crc kubenswrapper[4797]: I1013 15:36:19.156203 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"98efd66111782db243b8b6633201b9be37d56075cae91ee51e26bf205e3692c6"} Oct 13 15:36:19 crc kubenswrapper[4797]: I1013 15:36:19.156887 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf"} Oct 13 15:36:19 crc kubenswrapper[4797]: I1013 15:36:19.156920 4797 scope.go:117] "RemoveContainer" containerID="6a7efc0ac87fd5a3474be5b847665fe7e0d782a300b9c4d9444078885329765f" Oct 13 15:37:03 crc kubenswrapper[4797]: I1013 15:37:03.622172 4797 generic.go:334] "Generic (PLEG): container finished" podID="2e8a47d5-adb1-4909-9731-680948fa0320" containerID="c7382b4073910c39e5b5b03e1b34ceed5b9902bc31b2891d6039414f32635839" exitCode=0 Oct 13 15:37:03 crc kubenswrapper[4797]: I1013 15:37:03.622252 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" event={"ID":"2e8a47d5-adb1-4909-9731-680948fa0320","Type":"ContainerDied","Data":"c7382b4073910c39e5b5b03e1b34ceed5b9902bc31b2891d6039414f32635839"} Oct 13 15:37:05 crc 
kubenswrapper[4797]: I1013 15:37:05.122623 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.297342 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-0\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.297702 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-inventory\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.297762 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ceph\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.297798 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdxm\" (UniqueName: \"kubernetes.io/projected/2e8a47d5-adb1-4909-9731-680948fa0320-kube-api-access-xwdxm\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298307 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-1\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " 
Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298374 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-1\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298412 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-combined-ca-bundle\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298472 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ssh-key\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298514 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-0\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298586 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-0\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.298663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-1\") pod \"2e8a47d5-adb1-4909-9731-680948fa0320\" (UID: \"2e8a47d5-adb1-4909-9731-680948fa0320\") " Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.319596 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.324055 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8a47d5-adb1-4909-9731-680948fa0320-kube-api-access-xwdxm" (OuterVolumeSpecName: "kube-api-access-xwdxm") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "kube-api-access-xwdxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.324057 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ceph" (OuterVolumeSpecName: "ceph") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.327279 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). 
InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.339454 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.341599 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.354854 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.364097 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.368070 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.369308 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-inventory" (OuterVolumeSpecName: "inventory") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.379090 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2e8a47d5-adb1-4909-9731-680948fa0320" (UID: "2e8a47d5-adb1-4909-9731-680948fa0320"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401593 4797 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401638 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401647 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401655 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401666 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401676 4797 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ceph\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401684 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdxm\" (UniqueName: \"kubernetes.io/projected/2e8a47d5-adb1-4909-9731-680948fa0320-kube-api-access-xwdxm\") on node \"crc\" DevicePath \"\"" Oct 
13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401694 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401706 4797 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401714 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.401724 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e8a47d5-adb1-4909-9731-680948fa0320-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.643670 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" event={"ID":"2e8a47d5-adb1-4909-9731-680948fa0320","Type":"ContainerDied","Data":"b72389b82f5622223d308d5e737af51a9e904622e089e612d10339231ac66e86"} Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.643712 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b72389b82f5622223d308d5e737af51a9e904622e089e612d10339231ac66e86" Oct 13 15:37:05 crc kubenswrapper[4797]: I1013 15:37:05.643750 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x" Oct 13 15:38:18 crc kubenswrapper[4797]: I1013 15:38:18.120000 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:38:18 crc kubenswrapper[4797]: I1013 15:38:18.122030 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:38:48 crc kubenswrapper[4797]: I1013 15:38:48.119612 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:38:48 crc kubenswrapper[4797]: I1013 15:38:48.120250 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.191224 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89r22"] Oct 13 15:39:11 crc kubenswrapper[4797]: E1013 15:39:11.192122 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8a47d5-adb1-4909-9731-680948fa0320" 
containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.192140 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8a47d5-adb1-4909-9731-680948fa0320" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.192358 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8a47d5-adb1-4909-9731-680948fa0320" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.193854 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.215075 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89r22"] Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.325822 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-catalog-content\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.325929 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-utilities\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.325948 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkjs6\" (UniqueName: 
\"kubernetes.io/projected/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-kube-api-access-xkjs6\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.428167 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-catalog-content\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.428518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-utilities\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.428537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkjs6\" (UniqueName: \"kubernetes.io/projected/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-kube-api-access-xkjs6\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.429083 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-utilities\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.429370 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-catalog-content\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.461352 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkjs6\" (UniqueName: \"kubernetes.io/projected/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-kube-api-access-xkjs6\") pod \"community-operators-89r22\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:11 crc kubenswrapper[4797]: I1013 15:39:11.526055 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:12 crc kubenswrapper[4797]: I1013 15:39:12.121991 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89r22"] Oct 13 15:39:12 crc kubenswrapper[4797]: I1013 15:39:12.974256 4797 generic.go:334] "Generic (PLEG): container finished" podID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerID="178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2" exitCode=0 Oct 13 15:39:12 crc kubenswrapper[4797]: I1013 15:39:12.974493 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerDied","Data":"178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2"} Oct 13 15:39:12 crc kubenswrapper[4797]: I1013 15:39:12.974521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerStarted","Data":"fc5f241e6b82fd827d6be3a65517aff2b1df2cebf81521234600b1aa57517d63"} Oct 13 15:39:12 crc kubenswrapper[4797]: I1013 15:39:12.976951 4797 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:39:13 crc kubenswrapper[4797]: I1013 15:39:13.057693 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 13 15:39:13 crc kubenswrapper[4797]: I1013 15:39:13.058196 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="e6f3fa0b-2447-4956-8550-71b9a486cb9b" containerName="adoption" containerID="cri-o://247f4f4d82f828cf74310f3496e96c5bbbbce04b5e20087d937886dd34059a39" gracePeriod=30 Oct 13 15:39:13 crc kubenswrapper[4797]: I1013 15:39:13.984355 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerStarted","Data":"9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22"} Oct 13 15:39:15 crc kubenswrapper[4797]: E1013 15:39:15.176448 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0afb0e3e_b08e_4b28_bfa4_d5f6d4109635.slice/crio-conmon-9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22.scope\": RecentStats: unable to find data in memory cache]" Oct 13 15:39:16 crc kubenswrapper[4797]: I1013 15:39:16.005783 4797 generic.go:334] "Generic (PLEG): container finished" podID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerID="9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22" exitCode=0 Oct 13 15:39:16 crc kubenswrapper[4797]: I1013 15:39:16.005981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerDied","Data":"9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22"} Oct 13 15:39:17 crc kubenswrapper[4797]: I1013 15:39:17.022493 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerStarted","Data":"d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0"} Oct 13 15:39:17 crc kubenswrapper[4797]: I1013 15:39:17.050345 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89r22" podStartSLOduration=2.542609764 podStartE2EDuration="6.05031861s" podCreationTimestamp="2025-10-13 15:39:11 +0000 UTC" firstStartedPulling="2025-10-13 15:39:12.976712223 +0000 UTC m=+9130.510262479" lastFinishedPulling="2025-10-13 15:39:16.484421079 +0000 UTC m=+9134.017971325" observedRunningTime="2025-10-13 15:39:17.050212097 +0000 UTC m=+9134.583762353" watchObservedRunningTime="2025-10-13 15:39:17.05031861 +0000 UTC m=+9134.583868876" Oct 13 15:39:18 crc kubenswrapper[4797]: I1013 15:39:18.119778 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:39:18 crc kubenswrapper[4797]: I1013 15:39:18.120208 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:39:18 crc kubenswrapper[4797]: I1013 15:39:18.120269 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:39:18 crc kubenswrapper[4797]: I1013 15:39:18.121511 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:39:18 crc kubenswrapper[4797]: I1013 15:39:18.121599 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" gracePeriod=600 Oct 13 15:39:18 crc kubenswrapper[4797]: E1013 15:39:18.246372 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:39:19 crc kubenswrapper[4797]: I1013 15:39:19.045772 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" exitCode=0 Oct 13 15:39:19 crc kubenswrapper[4797]: I1013 15:39:19.045868 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf"} Oct 13 15:39:19 crc kubenswrapper[4797]: I1013 15:39:19.046585 4797 scope.go:117] "RemoveContainer" containerID="98efd66111782db243b8b6633201b9be37d56075cae91ee51e26bf205e3692c6" Oct 13 15:39:19 crc kubenswrapper[4797]: I1013 15:39:19.047654 4797 
scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:39:19 crc kubenswrapper[4797]: E1013 15:39:19.048179 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:39:21 crc kubenswrapper[4797]: I1013 15:39:21.527072 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:21 crc kubenswrapper[4797]: I1013 15:39:21.527868 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:21 crc kubenswrapper[4797]: I1013 15:39:21.572917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:22 crc kubenswrapper[4797]: I1013 15:39:22.145930 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:22 crc kubenswrapper[4797]: I1013 15:39:22.194604 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89r22"] Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.110990 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89r22" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="registry-server" containerID="cri-o://d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0" gracePeriod=2 Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.692815 4797 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.826683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkjs6\" (UniqueName: \"kubernetes.io/projected/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-kube-api-access-xkjs6\") pod \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.826850 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-utilities\") pod \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.826889 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-catalog-content\") pod \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\" (UID: \"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635\") " Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.827602 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-utilities" (OuterVolumeSpecName: "utilities") pod "0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" (UID: "0afb0e3e-b08e-4b28-bfa4-d5f6d4109635"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.846887 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-kube-api-access-xkjs6" (OuterVolumeSpecName: "kube-api-access-xkjs6") pod "0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" (UID: "0afb0e3e-b08e-4b28-bfa4-d5f6d4109635"). 
InnerVolumeSpecName "kube-api-access-xkjs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.889150 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" (UID: "0afb0e3e-b08e-4b28-bfa4-d5f6d4109635"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.929932 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.930223 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:39:24 crc kubenswrapper[4797]: I1013 15:39:24.930294 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkjs6\" (UniqueName: \"kubernetes.io/projected/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635-kube-api-access-xkjs6\") on node \"crc\" DevicePath \"\"" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.134208 4797 generic.go:334] "Generic (PLEG): container finished" podID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerID="d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0" exitCode=0 Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.134253 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerDied","Data":"d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0"} Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.134280 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89r22" event={"ID":"0afb0e3e-b08e-4b28-bfa4-d5f6d4109635","Type":"ContainerDied","Data":"fc5f241e6b82fd827d6be3a65517aff2b1df2cebf81521234600b1aa57517d63"} Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.134282 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89r22" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.134298 4797 scope.go:117] "RemoveContainer" containerID="d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.169082 4797 scope.go:117] "RemoveContainer" containerID="9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.173262 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89r22"] Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.198261 4797 scope.go:117] "RemoveContainer" containerID="178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.198653 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89r22"] Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.241595 4797 scope.go:117] "RemoveContainer" containerID="d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0" Oct 13 15:39:25 crc kubenswrapper[4797]: E1013 15:39:25.242620 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0\": container with ID starting with d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0 not found: ID does not exist" containerID="d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0" Oct 13 15:39:25 
crc kubenswrapper[4797]: I1013 15:39:25.242649 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0"} err="failed to get container status \"d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0\": rpc error: code = NotFound desc = could not find container \"d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0\": container with ID starting with d07e17d560ce0d859989fb15b702892fdf844955e568cac7c12efc085d0124b0 not found: ID does not exist" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.242668 4797 scope.go:117] "RemoveContainer" containerID="9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22" Oct 13 15:39:25 crc kubenswrapper[4797]: E1013 15:39:25.244619 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22\": container with ID starting with 9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22 not found: ID does not exist" containerID="9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.244645 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22"} err="failed to get container status \"9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22\": rpc error: code = NotFound desc = could not find container \"9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22\": container with ID starting with 9215fbb1cbd376a579aed53d28a8da2aba4c8b9ad7fbb8f7a65783faf200ea22 not found: ID does not exist" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.244661 4797 scope.go:117] "RemoveContainer" containerID="178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2" Oct 13 
15:39:25 crc kubenswrapper[4797]: E1013 15:39:25.248993 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2\": container with ID starting with 178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2 not found: ID does not exist" containerID="178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.249028 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2"} err="failed to get container status \"178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2\": rpc error: code = NotFound desc = could not find container \"178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2\": container with ID starting with 178aa2398515cee22d5d5df748b4cf6418879933579303597310857c49b2e4f2 not found: ID does not exist" Oct 13 15:39:25 crc kubenswrapper[4797]: I1013 15:39:25.261528 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" path="/var/lib/kubelet/pods/0afb0e3e-b08e-4b28-bfa4-d5f6d4109635/volumes" Oct 13 15:39:33 crc kubenswrapper[4797]: I1013 15:39:33.237045 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:39:33 crc kubenswrapper[4797]: E1013 15:39:33.237847 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:39:43 crc 
kubenswrapper[4797]: I1013 15:39:43.339191 4797 generic.go:334] "Generic (PLEG): container finished" podID="e6f3fa0b-2447-4956-8550-71b9a486cb9b" containerID="247f4f4d82f828cf74310f3496e96c5bbbbce04b5e20087d937886dd34059a39" exitCode=137 Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.339275 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6f3fa0b-2447-4956-8550-71b9a486cb9b","Type":"ContainerDied","Data":"247f4f4d82f828cf74310f3496e96c5bbbbce04b5e20087d937886dd34059a39"} Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.580655 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.703721 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") pod \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.704021 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9x5q\" (UniqueName: \"kubernetes.io/projected/e6f3fa0b-2447-4956-8550-71b9a486cb9b-kube-api-access-d9x5q\") pod \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\" (UID: \"e6f3fa0b-2447-4956-8550-71b9a486cb9b\") " Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.712520 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f3fa0b-2447-4956-8550-71b9a486cb9b-kube-api-access-d9x5q" (OuterVolumeSpecName: "kube-api-access-d9x5q") pod "e6f3fa0b-2447-4956-8550-71b9a486cb9b" (UID: "e6f3fa0b-2447-4956-8550-71b9a486cb9b"). InnerVolumeSpecName "kube-api-access-d9x5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.720897 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7" (OuterVolumeSpecName: "mariadb-data") pod "e6f3fa0b-2447-4956-8550-71b9a486cb9b" (UID: "e6f3fa0b-2447-4956-8550-71b9a486cb9b"). InnerVolumeSpecName "pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.806074 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9x5q\" (UniqueName: \"kubernetes.io/projected/e6f3fa0b-2447-4956-8550-71b9a486cb9b-kube-api-access-d9x5q\") on node \"crc\" DevicePath \"\"" Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.806133 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") on node \"crc\" " Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.830714 4797 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.831017 4797 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7") on node "crc" Oct 13 15:39:43 crc kubenswrapper[4797]: I1013 15:39:43.908270 4797 reconciler_common.go:293] "Volume detached for volume \"pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6eed2c-d005-4cb8-891b-223d0e4428a7\") on node \"crc\" DevicePath \"\"" Oct 13 15:39:44 crc kubenswrapper[4797]: I1013 15:39:44.354846 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6f3fa0b-2447-4956-8550-71b9a486cb9b","Type":"ContainerDied","Data":"1557ff85d31aba04800006bfccc48051ef31398adbbf8c01cf5a99b246ad8426"} Oct 13 15:39:44 crc kubenswrapper[4797]: I1013 15:39:44.354907 4797 scope.go:117] "RemoveContainer" containerID="247f4f4d82f828cf74310f3496e96c5bbbbce04b5e20087d937886dd34059a39" Oct 13 15:39:44 crc kubenswrapper[4797]: I1013 15:39:44.355027 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 13 15:39:44 crc kubenswrapper[4797]: I1013 15:39:44.391175 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 13 15:39:44 crc kubenswrapper[4797]: I1013 15:39:44.399548 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Oct 13 15:39:45 crc kubenswrapper[4797]: I1013 15:39:45.078190 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 13 15:39:45 crc kubenswrapper[4797]: I1013 15:39:45.078395 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="575e9052-19a2-442d-af24-3a8bd5a2eb64" containerName="adoption" containerID="cri-o://2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff" gracePeriod=30 Oct 13 15:39:45 crc kubenswrapper[4797]: I1013 15:39:45.254219 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f3fa0b-2447-4956-8550-71b9a486cb9b" path="/var/lib/kubelet/pods/e6f3fa0b-2447-4956-8550-71b9a486cb9b/volumes" Oct 13 15:39:48 crc kubenswrapper[4797]: I1013 15:39:48.237077 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:39:48 crc kubenswrapper[4797]: E1013 15:39:48.238261 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:40:01 crc kubenswrapper[4797]: I1013 15:40:01.238700 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:40:01 crc kubenswrapper[4797]: 
E1013 15:40:01.239646 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:40:14 crc kubenswrapper[4797]: I1013 15:40:14.237045 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:40:14 crc kubenswrapper[4797]: E1013 15:40:14.237619 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.546232 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.602085 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/575e9052-19a2-442d-af24-3a8bd5a2eb64-ovn-data-cert\") pod \"575e9052-19a2-442d-af24-3a8bd5a2eb64\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.603134 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") pod \"575e9052-19a2-442d-af24-3a8bd5a2eb64\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.603324 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6j2j\" (UniqueName: \"kubernetes.io/projected/575e9052-19a2-442d-af24-3a8bd5a2eb64-kube-api-access-f6j2j\") pod \"575e9052-19a2-442d-af24-3a8bd5a2eb64\" (UID: \"575e9052-19a2-442d-af24-3a8bd5a2eb64\") " Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.610124 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575e9052-19a2-442d-af24-3a8bd5a2eb64-kube-api-access-f6j2j" (OuterVolumeSpecName: "kube-api-access-f6j2j") pod "575e9052-19a2-442d-af24-3a8bd5a2eb64" (UID: "575e9052-19a2-442d-af24-3a8bd5a2eb64"). InnerVolumeSpecName "kube-api-access-f6j2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.610470 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575e9052-19a2-442d-af24-3a8bd5a2eb64-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "575e9052-19a2-442d-af24-3a8bd5a2eb64" (UID: "575e9052-19a2-442d-af24-3a8bd5a2eb64"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.622954 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567" (OuterVolumeSpecName: "ovn-data") pod "575e9052-19a2-442d-af24-3a8bd5a2eb64" (UID: "575e9052-19a2-442d-af24-3a8bd5a2eb64"). InnerVolumeSpecName "pvc-14c99935-2bda-49e3-9c30-9ad5847c9567". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.705484 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/575e9052-19a2-442d-af24-3a8bd5a2eb64-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.705551 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") on node \"crc\" " Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.705563 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6j2j\" (UniqueName: \"kubernetes.io/projected/575e9052-19a2-442d-af24-3a8bd5a2eb64-kube-api-access-f6j2j\") on node \"crc\" DevicePath \"\"" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.714233 4797 generic.go:334] "Generic (PLEG): container finished" podID="575e9052-19a2-442d-af24-3a8bd5a2eb64" containerID="2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff" exitCode=137 Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.714297 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.714425 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"575e9052-19a2-442d-af24-3a8bd5a2eb64","Type":"ContainerDied","Data":"2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff"} Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.714520 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"575e9052-19a2-442d-af24-3a8bd5a2eb64","Type":"ContainerDied","Data":"0647dda7607f8bdcd029415cfd253383347c5da774d58a46784333be814f72f3"} Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.714602 4797 scope.go:117] "RemoveContainer" containerID="2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.743986 4797 scope.go:117] "RemoveContainer" containerID="2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff" Oct 13 15:40:15 crc kubenswrapper[4797]: E1013 15:40:15.744951 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff\": container with ID starting with 2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff not found: ID does not exist" containerID="2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.744988 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff"} err="failed to get container status \"2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff\": rpc error: code = NotFound desc = could not find container \"2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff\": container with ID starting with 
2833b38e9f91168527f7300260906101fec4bde704ef193b6181eaf83bd5afff not found: ID does not exist" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.747200 4797 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.747499 4797 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-14c99935-2bda-49e3-9c30-9ad5847c9567" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567") on node "crc" Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.748205 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.757235 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Oct 13 15:40:15 crc kubenswrapper[4797]: I1013 15:40:15.807572 4797 reconciler_common.go:293] "Volume detached for volume \"pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-14c99935-2bda-49e3-9c30-9ad5847c9567\") on node \"crc\" DevicePath \"\"" Oct 13 15:40:17 crc kubenswrapper[4797]: I1013 15:40:17.257459 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575e9052-19a2-442d-af24-3a8bd5a2eb64" path="/var/lib/kubelet/pods/575e9052-19a2-442d-af24-3a8bd5a2eb64/volumes" Oct 13 15:40:29 crc kubenswrapper[4797]: I1013 15:40:29.236232 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:40:29 crc kubenswrapper[4797]: E1013 15:40:29.237422 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.784520 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 13 15:40:36 crc kubenswrapper[4797]: E1013 15:40:36.785406 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f3fa0b-2447-4956-8550-71b9a486cb9b" containerName="adoption" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785420 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3fa0b-2447-4956-8550-71b9a486cb9b" containerName="adoption" Oct 13 15:40:36 crc kubenswrapper[4797]: E1013 15:40:36.785431 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="registry-server" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785439 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="registry-server" Oct 13 15:40:36 crc kubenswrapper[4797]: E1013 15:40:36.785467 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="extract-utilities" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785474 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="extract-utilities" Oct 13 15:40:36 crc kubenswrapper[4797]: E1013 15:40:36.785495 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="extract-content" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785501 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="extract-content" Oct 13 15:40:36 crc kubenswrapper[4797]: E1013 
15:40:36.785515 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575e9052-19a2-442d-af24-3a8bd5a2eb64" containerName="adoption" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785521 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="575e9052-19a2-442d-af24-3a8bd5a2eb64" containerName="adoption" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785693 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f3fa0b-2447-4956-8550-71b9a486cb9b" containerName="adoption" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785708 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="575e9052-19a2-442d-af24-3a8bd5a2eb64" containerName="adoption" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.785720 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afb0e3e-b08e-4b28-bfa4-d5f6d4109635" containerName="registry-server" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.786455 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.790864 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.791028 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dj9th" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.791343 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.792237 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.803968 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.856938 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.857095 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-config-data\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.857732 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959151 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959612 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959645 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959666 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959709 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959730 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959772 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959791 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-config-data\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.959866 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2w2r\" (UniqueName: \"kubernetes.io/projected/b91301d7-01b4-48ec-b44e-12d408a58e1c-kube-api-access-k2w2r\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.961690 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.961775 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-config-data\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:36 crc kubenswrapper[4797]: I1013 15:40:36.972596 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.061892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062030 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2w2r\" (UniqueName: \"kubernetes.io/projected/b91301d7-01b4-48ec-b44e-12d408a58e1c-kube-api-access-k2w2r\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062080 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062146 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062181 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062308 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.062937 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") 
" pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.063590 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.069339 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.069434 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.081488 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2w2r\" (UniqueName: \"kubernetes.io/projected/b91301d7-01b4-48ec-b44e-12d408a58e1c-kube-api-access-k2w2r\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.094017 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.120570 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.624294 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 13 15:40:37 crc kubenswrapper[4797]: I1013 15:40:37.975138 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b91301d7-01b4-48ec-b44e-12d408a58e1c","Type":"ContainerStarted","Data":"a177c9392d3d3445d8f54082b2985a7f36488c8e82954460e0aca0be78f4f7e9"} Oct 13 15:40:40 crc kubenswrapper[4797]: I1013 15:40:40.237006 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:40:40 crc kubenswrapper[4797]: E1013 15:40:40.237572 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:40:53 crc kubenswrapper[4797]: I1013 15:40:53.244648 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:40:53 crc kubenswrapper[4797]: E1013 15:40:53.245554 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:41:06 crc kubenswrapper[4797]: I1013 15:41:06.236531 4797 scope.go:117] "RemoveContainer" 
containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:41:06 crc kubenswrapper[4797]: E1013 15:41:06.237487 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:41:19 crc kubenswrapper[4797]: I1013 15:41:19.239665 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:41:19 crc kubenswrapper[4797]: E1013 15:41:19.240382 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:41:21 crc kubenswrapper[4797]: E1013 15:41:21.217049 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:1e4eeec18f8da2b364b39b7a7358aef5" Oct 13 15:41:21 crc kubenswrapper[4797]: E1013 15:41:21.217489 4797 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:1e4eeec18f8da2b364b39b7a7358aef5" Oct 13 15:41:21 crc kubenswrapper[4797]: E1013 15:41:21.217771 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:1e4eeec18f8da2b364b39b7a7358aef5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2w2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b91301d7-01b4-48ec-b44e-12d408a58e1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 15:41:21 crc kubenswrapper[4797]: E1013 15:41:21.220916 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b91301d7-01b4-48ec-b44e-12d408a58e1c" Oct 13 15:41:21 crc kubenswrapper[4797]: E1013 15:41:21.457119 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:1e4eeec18f8da2b364b39b7a7358aef5\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="b91301d7-01b4-48ec-b44e-12d408a58e1c" Oct 13 15:41:31 crc kubenswrapper[4797]: I1013 15:41:31.238444 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:41:31 crc kubenswrapper[4797]: E1013 15:41:31.240048 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:41:36 crc kubenswrapper[4797]: I1013 15:41:36.454263 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 13 15:41:37 crc kubenswrapper[4797]: I1013 15:41:37.624443 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b91301d7-01b4-48ec-b44e-12d408a58e1c","Type":"ContainerStarted","Data":"9dad40b66fcc3f0e2c5c03208f75913f92be54c06ac49e240af444ff05192294"} Oct 13 15:41:37 crc kubenswrapper[4797]: I1013 15:41:37.650474 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.832452096 podStartE2EDuration="1m2.650448984s" podCreationTimestamp="2025-10-13 15:40:35 +0000 UTC" firstStartedPulling="2025-10-13 15:40:37.634276827 +0000 UTC m=+9215.167827083" lastFinishedPulling="2025-10-13 15:41:36.452273715 +0000 UTC m=+9273.985823971" observedRunningTime="2025-10-13 15:41:37.640896749 +0000 UTC m=+9275.174447015" watchObservedRunningTime="2025-10-13 15:41:37.650448984 +0000 UTC m=+9275.183999240" Oct 13 15:41:44 crc kubenswrapper[4797]: I1013 15:41:44.237956 4797 scope.go:117] "RemoveContainer" 
containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:41:44 crc kubenswrapper[4797]: E1013 15:41:44.238626 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:41:55 crc kubenswrapper[4797]: I1013 15:41:55.237498 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:41:55 crc kubenswrapper[4797]: E1013 15:41:55.238982 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.159243 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pf9n4"] Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.161942 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.191005 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pf9n4"] Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.271881 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-utilities\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.271968 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5j72\" (UniqueName: \"kubernetes.io/projected/48786dcd-7a4c-412a-8507-eff373a15553-kube-api-access-f5j72\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.271990 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-catalog-content\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.375242 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-utilities\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.375601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-utilities\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.376552 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5j72\" (UniqueName: \"kubernetes.io/projected/48786dcd-7a4c-412a-8507-eff373a15553-kube-api-access-f5j72\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.376590 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-catalog-content\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.376880 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-catalog-content\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.409144 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5j72\" (UniqueName: \"kubernetes.io/projected/48786dcd-7a4c-412a-8507-eff373a15553-kube-api-access-f5j72\") pod \"redhat-marketplace-pf9n4\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.491948 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.819386 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pf9n4"] Oct 13 15:41:58 crc kubenswrapper[4797]: W1013 15:41:58.825859 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48786dcd_7a4c_412a_8507_eff373a15553.slice/crio-17a616e7ac2beea1df1cc37bf6618ebfbe1c9a47c0eea6fed563be2f74e4ad9c WatchSource:0}: Error finding container 17a616e7ac2beea1df1cc37bf6618ebfbe1c9a47c0eea6fed563be2f74e4ad9c: Status 404 returned error can't find the container with id 17a616e7ac2beea1df1cc37bf6618ebfbe1c9a47c0eea6fed563be2f74e4ad9c Oct 13 15:41:58 crc kubenswrapper[4797]: I1013 15:41:58.858819 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pf9n4" event={"ID":"48786dcd-7a4c-412a-8507-eff373a15553","Type":"ContainerStarted","Data":"17a616e7ac2beea1df1cc37bf6618ebfbe1c9a47c0eea6fed563be2f74e4ad9c"} Oct 13 15:41:59 crc kubenswrapper[4797]: I1013 15:41:59.869690 4797 generic.go:334] "Generic (PLEG): container finished" podID="48786dcd-7a4c-412a-8507-eff373a15553" containerID="1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c" exitCode=0 Oct 13 15:41:59 crc kubenswrapper[4797]: I1013 15:41:59.869785 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pf9n4" event={"ID":"48786dcd-7a4c-412a-8507-eff373a15553","Type":"ContainerDied","Data":"1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c"} Oct 13 15:42:01 crc kubenswrapper[4797]: I1013 15:42:01.999140 4797 generic.go:334] "Generic (PLEG): container finished" podID="48786dcd-7a4c-412a-8507-eff373a15553" containerID="4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633" exitCode=0 Oct 13 15:42:01 crc kubenswrapper[4797]: I1013 
15:42:01.999235 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pf9n4" event={"ID":"48786dcd-7a4c-412a-8507-eff373a15553","Type":"ContainerDied","Data":"4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633"} Oct 13 15:42:03 crc kubenswrapper[4797]: I1013 15:42:03.012461 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pf9n4" event={"ID":"48786dcd-7a4c-412a-8507-eff373a15553","Type":"ContainerStarted","Data":"bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e"} Oct 13 15:42:03 crc kubenswrapper[4797]: I1013 15:42:03.036441 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pf9n4" podStartSLOduration=2.485982912 podStartE2EDuration="5.036422193s" podCreationTimestamp="2025-10-13 15:41:58 +0000 UTC" firstStartedPulling="2025-10-13 15:41:59.871681677 +0000 UTC m=+9297.405231933" lastFinishedPulling="2025-10-13 15:42:02.422120948 +0000 UTC m=+9299.955671214" observedRunningTime="2025-10-13 15:42:03.029514503 +0000 UTC m=+9300.563064799" watchObservedRunningTime="2025-10-13 15:42:03.036422193 +0000 UTC m=+9300.569972459" Oct 13 15:42:06 crc kubenswrapper[4797]: I1013 15:42:06.236970 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:42:06 crc kubenswrapper[4797]: E1013 15:42:06.237660 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:42:08 crc kubenswrapper[4797]: I1013 15:42:08.492848 4797 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:42:08 crc kubenswrapper[4797]: I1013 15:42:08.493395 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:42:08 crc kubenswrapper[4797]: I1013 15:42:08.545317 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:42:09 crc kubenswrapper[4797]: I1013 15:42:09.136981 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:42:09 crc kubenswrapper[4797]: I1013 15:42:09.204993 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pf9n4"] Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.095200 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pf9n4" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="registry-server" containerID="cri-o://bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e" gracePeriod=2 Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.654383 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.763465 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5j72\" (UniqueName: \"kubernetes.io/projected/48786dcd-7a4c-412a-8507-eff373a15553-kube-api-access-f5j72\") pod \"48786dcd-7a4c-412a-8507-eff373a15553\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.763551 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-utilities\") pod \"48786dcd-7a4c-412a-8507-eff373a15553\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.763670 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-catalog-content\") pod \"48786dcd-7a4c-412a-8507-eff373a15553\" (UID: \"48786dcd-7a4c-412a-8507-eff373a15553\") " Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.764463 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-utilities" (OuterVolumeSpecName: "utilities") pod "48786dcd-7a4c-412a-8507-eff373a15553" (UID: "48786dcd-7a4c-412a-8507-eff373a15553"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.769035 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48786dcd-7a4c-412a-8507-eff373a15553-kube-api-access-f5j72" (OuterVolumeSpecName: "kube-api-access-f5j72") pod "48786dcd-7a4c-412a-8507-eff373a15553" (UID: "48786dcd-7a4c-412a-8507-eff373a15553"). InnerVolumeSpecName "kube-api-access-f5j72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.771102 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5j72\" (UniqueName: \"kubernetes.io/projected/48786dcd-7a4c-412a-8507-eff373a15553-kube-api-access-f5j72\") on node \"crc\" DevicePath \"\"" Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.771147 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.775181 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48786dcd-7a4c-412a-8507-eff373a15553" (UID: "48786dcd-7a4c-412a-8507-eff373a15553"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:42:11 crc kubenswrapper[4797]: I1013 15:42:11.872955 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48786dcd-7a4c-412a-8507-eff373a15553-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.116600 4797 generic.go:334] "Generic (PLEG): container finished" podID="48786dcd-7a4c-412a-8507-eff373a15553" containerID="bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e" exitCode=0 Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.116645 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pf9n4" event={"ID":"48786dcd-7a4c-412a-8507-eff373a15553","Type":"ContainerDied","Data":"bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e"} Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.116670 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pf9n4" event={"ID":"48786dcd-7a4c-412a-8507-eff373a15553","Type":"ContainerDied","Data":"17a616e7ac2beea1df1cc37bf6618ebfbe1c9a47c0eea6fed563be2f74e4ad9c"} Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.116686 4797 scope.go:117] "RemoveContainer" containerID="bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.116805 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pf9n4" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.149942 4797 scope.go:117] "RemoveContainer" containerID="4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.153490 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pf9n4"] Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.167687 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pf9n4"] Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.176851 4797 scope.go:117] "RemoveContainer" containerID="1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.233140 4797 scope.go:117] "RemoveContainer" containerID="bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e" Oct 13 15:42:12 crc kubenswrapper[4797]: E1013 15:42:12.233705 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e\": container with ID starting with bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e not found: ID does not exist" containerID="bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.233761 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e"} err="failed to get container status \"bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e\": rpc error: code = NotFound desc = could not find container \"bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e\": container with ID starting with bca636dd6f6fdc5d23bbb6a4a442fee00ca6a7748374d5fb0b7a05aca5bcee1e not found: ID does not exist" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.233792 4797 scope.go:117] "RemoveContainer" containerID="4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633" Oct 13 15:42:12 crc kubenswrapper[4797]: E1013 15:42:12.234668 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633\": container with ID starting with 4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633 not found: ID does not exist" containerID="4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.234711 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633"} err="failed to get container status \"4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633\": rpc error: code = NotFound desc = could not find container \"4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633\": container with ID starting with 4fd936ffc3074d3f7548b08073e9dbb5fdef3825e9ff3e7ff9d57b3846b3e633 not found: ID does not exist" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.234736 4797 scope.go:117] "RemoveContainer" containerID="1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c" Oct 13 15:42:12 crc kubenswrapper[4797]: E1013 
15:42:12.235199 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c\": container with ID starting with 1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c not found: ID does not exist" containerID="1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c" Oct 13 15:42:12 crc kubenswrapper[4797]: I1013 15:42:12.235262 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c"} err="failed to get container status \"1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c\": rpc error: code = NotFound desc = could not find container \"1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c\": container with ID starting with 1371214222caed37ddfce4049e8b7426b8f87674b4e7fdafe2e7a6b8ecd7069c not found: ID does not exist" Oct 13 15:42:13 crc kubenswrapper[4797]: I1013 15:42:13.247395 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48786dcd-7a4c-412a-8507-eff373a15553" path="/var/lib/kubelet/pods/48786dcd-7a4c-412a-8507-eff373a15553/volumes" Oct 13 15:42:17 crc kubenswrapper[4797]: I1013 15:42:17.236723 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:42:17 crc kubenswrapper[4797]: E1013 15:42:17.237617 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.723006 
4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xtvd9"] Oct 13 15:42:22 crc kubenswrapper[4797]: E1013 15:42:22.724032 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="extract-content" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.724054 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="extract-content" Oct 13 15:42:22 crc kubenswrapper[4797]: E1013 15:42:22.724093 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="extract-utilities" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.724102 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="extract-utilities" Oct 13 15:42:22 crc kubenswrapper[4797]: E1013 15:42:22.724136 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="registry-server" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.724145 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="registry-server" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.724398 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="48786dcd-7a4c-412a-8507-eff373a15553" containerName="registry-server" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.725973 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.750393 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtvd9"] Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.899520 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-utilities\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.899948 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cw2c\" (UniqueName: \"kubernetes.io/projected/7137d952-5eec-4acd-8208-ab50d5ffd196-kube-api-access-8cw2c\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:22 crc kubenswrapper[4797]: I1013 15:42:22.899993 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-catalog-content\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.002405 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cw2c\" (UniqueName: \"kubernetes.io/projected/7137d952-5eec-4acd-8208-ab50d5ffd196-kube-api-access-8cw2c\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.002861 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-catalog-content\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.003168 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-utilities\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.003279 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-catalog-content\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.003568 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-utilities\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.026109 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cw2c\" (UniqueName: \"kubernetes.io/projected/7137d952-5eec-4acd-8208-ab50d5ffd196-kube-api-access-8cw2c\") pod \"redhat-operators-xtvd9\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.059118 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:23 crc kubenswrapper[4797]: I1013 15:42:23.644473 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtvd9"] Oct 13 15:42:24 crc kubenswrapper[4797]: I1013 15:42:24.322732 4797 generic.go:334] "Generic (PLEG): container finished" podID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerID="8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c" exitCode=0 Oct 13 15:42:24 crc kubenswrapper[4797]: I1013 15:42:24.324551 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerDied","Data":"8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c"} Oct 13 15:42:24 crc kubenswrapper[4797]: I1013 15:42:24.324657 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerStarted","Data":"5408ea0b86a1b6ab668650a9ffbb0cd6b716876a8649efed4fbc1e55906be247"} Oct 13 15:42:26 crc kubenswrapper[4797]: I1013 15:42:26.347437 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerStarted","Data":"ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903"} Oct 13 15:42:29 crc kubenswrapper[4797]: I1013 15:42:29.378857 4797 generic.go:334] "Generic (PLEG): container finished" podID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerID="ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903" exitCode=0 Oct 13 15:42:29 crc kubenswrapper[4797]: I1013 15:42:29.378939 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" 
event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerDied","Data":"ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903"} Oct 13 15:42:30 crc kubenswrapper[4797]: I1013 15:42:30.390934 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerStarted","Data":"5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92"} Oct 13 15:42:30 crc kubenswrapper[4797]: I1013 15:42:30.420253 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xtvd9" podStartSLOduration=2.858991972 podStartE2EDuration="8.420232979s" podCreationTimestamp="2025-10-13 15:42:22 +0000 UTC" firstStartedPulling="2025-10-13 15:42:24.326660969 +0000 UTC m=+9321.860211265" lastFinishedPulling="2025-10-13 15:42:29.887902016 +0000 UTC m=+9327.421452272" observedRunningTime="2025-10-13 15:42:30.412789616 +0000 UTC m=+9327.946339872" watchObservedRunningTime="2025-10-13 15:42:30.420232979 +0000 UTC m=+9327.953783245" Oct 13 15:42:32 crc kubenswrapper[4797]: I1013 15:42:32.236817 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:42:32 crc kubenswrapper[4797]: E1013 15:42:32.237458 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:42:33 crc kubenswrapper[4797]: I1013 15:42:33.059355 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:33 crc kubenswrapper[4797]: 
I1013 15:42:33.059614 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:34 crc kubenswrapper[4797]: I1013 15:42:34.120336 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xtvd9" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="registry-server" probeResult="failure" output=< Oct 13 15:42:34 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Oct 13 15:42:34 crc kubenswrapper[4797]: > Oct 13 15:42:43 crc kubenswrapper[4797]: I1013 15:42:43.113583 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:43 crc kubenswrapper[4797]: I1013 15:42:43.165333 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:43 crc kubenswrapper[4797]: I1013 15:42:43.412350 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtvd9"] Oct 13 15:42:44 crc kubenswrapper[4797]: I1013 15:42:44.519325 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xtvd9" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="registry-server" containerID="cri-o://5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92" gracePeriod=2 Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.365654 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.490317 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-utilities\") pod \"7137d952-5eec-4acd-8208-ab50d5ffd196\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.490521 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-catalog-content\") pod \"7137d952-5eec-4acd-8208-ab50d5ffd196\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.490625 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cw2c\" (UniqueName: \"kubernetes.io/projected/7137d952-5eec-4acd-8208-ab50d5ffd196-kube-api-access-8cw2c\") pod \"7137d952-5eec-4acd-8208-ab50d5ffd196\" (UID: \"7137d952-5eec-4acd-8208-ab50d5ffd196\") " Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.491236 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-utilities" (OuterVolumeSpecName: "utilities") pod "7137d952-5eec-4acd-8208-ab50d5ffd196" (UID: "7137d952-5eec-4acd-8208-ab50d5ffd196"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.509099 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7137d952-5eec-4acd-8208-ab50d5ffd196-kube-api-access-8cw2c" (OuterVolumeSpecName: "kube-api-access-8cw2c") pod "7137d952-5eec-4acd-8208-ab50d5ffd196" (UID: "7137d952-5eec-4acd-8208-ab50d5ffd196"). InnerVolumeSpecName "kube-api-access-8cw2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.529919 4797 generic.go:334] "Generic (PLEG): container finished" podID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerID="5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92" exitCode=0 Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.529958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerDied","Data":"5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92"} Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.529983 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtvd9" event={"ID":"7137d952-5eec-4acd-8208-ab50d5ffd196","Type":"ContainerDied","Data":"5408ea0b86a1b6ab668650a9ffbb0cd6b716876a8649efed4fbc1e55906be247"} Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.530019 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xtvd9" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.530014 4797 scope.go:117] "RemoveContainer" containerID="5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.569044 4797 scope.go:117] "RemoveContainer" containerID="ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.593645 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.593677 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cw2c\" (UniqueName: \"kubernetes.io/projected/7137d952-5eec-4acd-8208-ab50d5ffd196-kube-api-access-8cw2c\") on node \"crc\" DevicePath \"\"" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.597441 4797 scope.go:117] "RemoveContainer" containerID="8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.615566 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7137d952-5eec-4acd-8208-ab50d5ffd196" (UID: "7137d952-5eec-4acd-8208-ab50d5ffd196"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.642927 4797 scope.go:117] "RemoveContainer" containerID="5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92" Oct 13 15:42:45 crc kubenswrapper[4797]: E1013 15:42:45.645069 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92\": container with ID starting with 5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92 not found: ID does not exist" containerID="5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.645114 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92"} err="failed to get container status \"5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92\": rpc error: code = NotFound desc = could not find container \"5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92\": container with ID starting with 5bc232860f5ff3766f4265d748a27d6ab23c8cbb53bae07fcb0b85178ac02b92 not found: ID does not exist" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.645142 4797 scope.go:117] "RemoveContainer" containerID="ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903" Oct 13 15:42:45 crc kubenswrapper[4797]: E1013 15:42:45.646179 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903\": container with ID starting with ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903 not found: ID does not exist" containerID="ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.646203 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903"} err="failed to get container status \"ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903\": rpc error: code = NotFound desc = could not find container \"ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903\": container with ID starting with ef46209447ae6264484e3ac34d5b3053f56a513bdd1ee9f127c34a0e5c0b0903 not found: ID does not exist" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.646220 4797 scope.go:117] "RemoveContainer" containerID="8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c" Oct 13 15:42:45 crc kubenswrapper[4797]: E1013 15:42:45.646456 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c\": container with ID starting with 8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c not found: ID does not exist" containerID="8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.646476 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c"} err="failed to get container status \"8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c\": rpc error: code = NotFound desc = could not find container \"8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c\": container with ID starting with 8e1ceedfeaa1ef3c846120fab5fe9b694a95c62dbaaa996ebce57fc65b6c199c not found: ID does not exist" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.695064 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7137d952-5eec-4acd-8208-ab50d5ffd196-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.882273 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtvd9"] Oct 13 15:42:45 crc kubenswrapper[4797]: I1013 15:42:45.893812 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xtvd9"] Oct 13 15:42:47 crc kubenswrapper[4797]: I1013 15:42:47.236983 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:42:47 crc kubenswrapper[4797]: E1013 15:42:47.237502 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:42:47 crc kubenswrapper[4797]: I1013 15:42:47.249565 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" path="/var/lib/kubelet/pods/7137d952-5eec-4acd-8208-ab50d5ffd196/volumes" Oct 13 15:43:02 crc kubenswrapper[4797]: I1013 15:43:02.236739 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:43:02 crc kubenswrapper[4797]: E1013 15:43:02.237482 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.249754 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzt6x"] Oct 13 15:43:05 crc kubenswrapper[4797]: E1013 15:43:05.251148 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="registry-server" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.251163 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="registry-server" Oct 13 15:43:05 crc kubenswrapper[4797]: E1013 15:43:05.251207 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="extract-content" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.251214 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="extract-content" Oct 13 15:43:05 crc kubenswrapper[4797]: E1013 15:43:05.251240 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="extract-utilities" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.251246 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="extract-utilities" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.251488 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7137d952-5eec-4acd-8208-ab50d5ffd196" containerName="registry-server" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.253405 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.263539 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzt6x"] Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.306094 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-catalog-content\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.306151 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-utilities\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.306346 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vnl\" (UniqueName: \"kubernetes.io/projected/affb42a1-2da4-4a29-887f-323a055ae3aa-kube-api-access-42vnl\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.408096 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vnl\" (UniqueName: \"kubernetes.io/projected/affb42a1-2da4-4a29-887f-323a055ae3aa-kube-api-access-42vnl\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.408188 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-catalog-content\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.408215 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-utilities\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.408666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-utilities\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.408788 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-catalog-content\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.433351 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vnl\" (UniqueName: \"kubernetes.io/projected/affb42a1-2da4-4a29-887f-323a055ae3aa-kube-api-access-42vnl\") pod \"certified-operators-vzt6x\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:05 crc kubenswrapper[4797]: I1013 15:43:05.577291 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:06 crc kubenswrapper[4797]: I1013 15:43:06.155057 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzt6x"] Oct 13 15:43:06 crc kubenswrapper[4797]: I1013 15:43:06.766444 4797 generic.go:334] "Generic (PLEG): container finished" podID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerID="bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88" exitCode=0 Oct 13 15:43:06 crc kubenswrapper[4797]: I1013 15:43:06.766505 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzt6x" event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerDied","Data":"bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88"} Oct 13 15:43:06 crc kubenswrapper[4797]: I1013 15:43:06.766717 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzt6x" event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerStarted","Data":"5443849b3eaf5d542c33e52f1c1c442d7007dee9df92da4c4f3fd2c7ddcacfd0"} Oct 13 15:43:07 crc kubenswrapper[4797]: I1013 15:43:07.776424 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzt6x" event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerStarted","Data":"165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57"} Oct 13 15:43:08 crc kubenswrapper[4797]: I1013 15:43:08.789553 4797 generic.go:334] "Generic (PLEG): container finished" podID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerID="165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57" exitCode=0 Oct 13 15:43:08 crc kubenswrapper[4797]: I1013 15:43:08.789750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzt6x" 
event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerDied","Data":"165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57"} Oct 13 15:43:09 crc kubenswrapper[4797]: I1013 15:43:09.806657 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzt6x" event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerStarted","Data":"1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6"} Oct 13 15:43:09 crc kubenswrapper[4797]: I1013 15:43:09.827054 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzt6x" podStartSLOduration=2.372573274 podStartE2EDuration="4.827037076s" podCreationTimestamp="2025-10-13 15:43:05 +0000 UTC" firstStartedPulling="2025-10-13 15:43:06.768834769 +0000 UTC m=+9364.302385025" lastFinishedPulling="2025-10-13 15:43:09.223298571 +0000 UTC m=+9366.756848827" observedRunningTime="2025-10-13 15:43:09.826112803 +0000 UTC m=+9367.359663099" watchObservedRunningTime="2025-10-13 15:43:09.827037076 +0000 UTC m=+9367.360587342" Oct 13 15:43:15 crc kubenswrapper[4797]: I1013 15:43:15.577665 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:15 crc kubenswrapper[4797]: I1013 15:43:15.578171 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:15 crc kubenswrapper[4797]: I1013 15:43:15.628460 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:15 crc kubenswrapper[4797]: I1013 15:43:15.957924 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:16 crc kubenswrapper[4797]: I1013 15:43:16.010207 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-vzt6x"] Oct 13 15:43:17 crc kubenswrapper[4797]: I1013 15:43:17.236600 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:43:17 crc kubenswrapper[4797]: E1013 15:43:17.236914 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:43:17 crc kubenswrapper[4797]: I1013 15:43:17.920095 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzt6x" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="registry-server" containerID="cri-o://1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6" gracePeriod=2 Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.581552 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.700347 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-utilities\") pod \"affb42a1-2da4-4a29-887f-323a055ae3aa\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.700484 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42vnl\" (UniqueName: \"kubernetes.io/projected/affb42a1-2da4-4a29-887f-323a055ae3aa-kube-api-access-42vnl\") pod \"affb42a1-2da4-4a29-887f-323a055ae3aa\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.700696 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-catalog-content\") pod \"affb42a1-2da4-4a29-887f-323a055ae3aa\" (UID: \"affb42a1-2da4-4a29-887f-323a055ae3aa\") " Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.701593 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-utilities" (OuterVolumeSpecName: "utilities") pod "affb42a1-2da4-4a29-887f-323a055ae3aa" (UID: "affb42a1-2da4-4a29-887f-323a055ae3aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.706248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/affb42a1-2da4-4a29-887f-323a055ae3aa-kube-api-access-42vnl" (OuterVolumeSpecName: "kube-api-access-42vnl") pod "affb42a1-2da4-4a29-887f-323a055ae3aa" (UID: "affb42a1-2da4-4a29-887f-323a055ae3aa"). InnerVolumeSpecName "kube-api-access-42vnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.722684 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.722721 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42vnl\" (UniqueName: \"kubernetes.io/projected/affb42a1-2da4-4a29-887f-323a055ae3aa-kube-api-access-42vnl\") on node \"crc\" DevicePath \"\"" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.767606 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "affb42a1-2da4-4a29-887f-323a055ae3aa" (UID: "affb42a1-2da4-4a29-887f-323a055ae3aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.825112 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affb42a1-2da4-4a29-887f-323a055ae3aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.935108 4797 generic.go:334] "Generic (PLEG): container finished" podID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerID="1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6" exitCode=0 Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.935149 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzt6x" event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerDied","Data":"1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6"} Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.935175 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vzt6x" event={"ID":"affb42a1-2da4-4a29-887f-323a055ae3aa","Type":"ContainerDied","Data":"5443849b3eaf5d542c33e52f1c1c442d7007dee9df92da4c4f3fd2c7ddcacfd0"} Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.935176 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzt6x" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.935189 4797 scope.go:117] "RemoveContainer" containerID="1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.971337 4797 scope.go:117] "RemoveContainer" containerID="165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57" Oct 13 15:43:18 crc kubenswrapper[4797]: I1013 15:43:18.975035 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzt6x"] Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.005402 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzt6x"] Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.013286 4797 scope.go:117] "RemoveContainer" containerID="bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.070062 4797 scope.go:117] "RemoveContainer" containerID="1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6" Oct 13 15:43:19 crc kubenswrapper[4797]: E1013 15:43:19.070993 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6\": container with ID starting with 1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6 not found: ID does not exist" containerID="1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 
15:43:19.071047 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6"} err="failed to get container status \"1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6\": rpc error: code = NotFound desc = could not find container \"1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6\": container with ID starting with 1930134cbed9ef176c9022e296e32665009d0d01c3f737b807d4a0ae09e60be6 not found: ID does not exist" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.071079 4797 scope.go:117] "RemoveContainer" containerID="165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57" Oct 13 15:43:19 crc kubenswrapper[4797]: E1013 15:43:19.071493 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57\": container with ID starting with 165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57 not found: ID does not exist" containerID="165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.071525 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57"} err="failed to get container status \"165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57\": rpc error: code = NotFound desc = could not find container \"165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57\": container with ID starting with 165d933e6c86aa85f0a9b085d10b58bc912f2db48cf6efefc072b4cbd6b6ed57 not found: ID does not exist" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.071549 4797 scope.go:117] "RemoveContainer" containerID="bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88" Oct 13 15:43:19 crc 
kubenswrapper[4797]: E1013 15:43:19.071855 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88\": container with ID starting with bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88 not found: ID does not exist" containerID="bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.071876 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88"} err="failed to get container status \"bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88\": rpc error: code = NotFound desc = could not find container \"bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88\": container with ID starting with bc662a69214c4592d04ada007846dde5242e6c228b23146dd0f79cfaa0035c88 not found: ID does not exist" Oct 13 15:43:19 crc kubenswrapper[4797]: I1013 15:43:19.277608 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" path="/var/lib/kubelet/pods/affb42a1-2da4-4a29-887f-323a055ae3aa/volumes" Oct 13 15:43:29 crc kubenswrapper[4797]: I1013 15:43:29.236640 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:43:29 crc kubenswrapper[4797]: E1013 15:43:29.238360 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:43:44 crc 
kubenswrapper[4797]: I1013 15:43:44.237198 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:43:44 crc kubenswrapper[4797]: E1013 15:43:44.238027 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:43:58 crc kubenswrapper[4797]: I1013 15:43:58.236691 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:43:58 crc kubenswrapper[4797]: E1013 15:43:58.237457 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:44:10 crc kubenswrapper[4797]: I1013 15:44:10.237160 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:44:10 crc kubenswrapper[4797]: E1013 15:44:10.238541 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 
13 15:44:25 crc kubenswrapper[4797]: I1013 15:44:25.236919 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:44:25 crc kubenswrapper[4797]: I1013 15:44:25.601320 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"01b3742782a202fb659c82f51e632c96943ddd476f7bd91be0bd9afd8946b3b8"} Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.163671 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s"] Oct 13 15:45:00 crc kubenswrapper[4797]: E1013 15:45:00.164769 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="registry-server" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.164790 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="registry-server" Oct 13 15:45:00 crc kubenswrapper[4797]: E1013 15:45:00.164845 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="extract-content" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.164855 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="extract-content" Oct 13 15:45:00 crc kubenswrapper[4797]: E1013 15:45:00.164892 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="extract-utilities" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.164903 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="extract-utilities" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.165124 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="affb42a1-2da4-4a29-887f-323a055ae3aa" containerName="registry-server" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.166282 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.169385 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.169407 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.178080 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s"] Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.184330 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-config-volume\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.184385 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4k6n\" (UniqueName: \"kubernetes.io/projected/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-kube-api-access-h4k6n\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.184793 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-secret-volume\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.286842 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-secret-volume\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.287311 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-config-volume\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.287335 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4k6n\" (UniqueName: \"kubernetes.io/projected/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-kube-api-access-h4k6n\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.291420 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-config-volume\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 
15:45:00.299338 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-secret-volume\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.315558 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4k6n\" (UniqueName: \"kubernetes.io/projected/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-kube-api-access-h4k6n\") pod \"collect-profiles-29339505-rm55s\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.495397 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:00 crc kubenswrapper[4797]: I1013 15:45:00.996147 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s"] Oct 13 15:45:01 crc kubenswrapper[4797]: I1013 15:45:01.980315 4797 generic.go:334] "Generic (PLEG): container finished" podID="b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" containerID="7a4e0fe7018f669fb25c7ef0684bd4d78772bd27fb9b3e4419eb6b576d1c5083" exitCode=0 Oct 13 15:45:01 crc kubenswrapper[4797]: I1013 15:45:01.980373 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" event={"ID":"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae","Type":"ContainerDied","Data":"7a4e0fe7018f669fb25c7ef0684bd4d78772bd27fb9b3e4419eb6b576d1c5083"} Oct 13 15:45:01 crc kubenswrapper[4797]: I1013 15:45:01.980910 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" 
event={"ID":"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae","Type":"ContainerStarted","Data":"063ca1af8c493b32a69352fab647a63f400c4cf4db1147815c42f9389d3572d2"} Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.618669 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.647181 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4k6n\" (UniqueName: \"kubernetes.io/projected/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-kube-api-access-h4k6n\") pod \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.647241 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-config-volume\") pod \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.647290 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-secret-volume\") pod \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\" (UID: \"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae\") " Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.651123 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-config-volume" (OuterVolumeSpecName: "config-volume") pod "b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" (UID: "b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.671003 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" (UID: "b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.671716 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-kube-api-access-h4k6n" (OuterVolumeSpecName: "kube-api-access-h4k6n") pod "b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" (UID: "b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae"). InnerVolumeSpecName "kube-api-access-h4k6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.754176 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4k6n\" (UniqueName: \"kubernetes.io/projected/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-kube-api-access-h4k6n\") on node \"crc\" DevicePath \"\"" Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.754426 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:45:03 crc kubenswrapper[4797]: I1013 15:45:03.754504 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 15:45:04 crc kubenswrapper[4797]: I1013 15:45:04.000068 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" 
event={"ID":"b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae","Type":"ContainerDied","Data":"063ca1af8c493b32a69352fab647a63f400c4cf4db1147815c42f9389d3572d2"} Oct 13 15:45:04 crc kubenswrapper[4797]: I1013 15:45:04.000105 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063ca1af8c493b32a69352fab647a63f400c4cf4db1147815c42f9389d3572d2" Oct 13 15:45:04 crc kubenswrapper[4797]: I1013 15:45:04.000155 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339505-rm55s" Oct 13 15:45:04 crc kubenswrapper[4797]: I1013 15:45:04.720167 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt"] Oct 13 15:45:04 crc kubenswrapper[4797]: I1013 15:45:04.729852 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339460-5tzwt"] Oct 13 15:45:05 crc kubenswrapper[4797]: I1013 15:45:05.247559 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d721eb5f-2ce0-4e90-b01c-ed1d97a58208" path="/var/lib/kubelet/pods/d721eb5f-2ce0-4e90-b01c-ed1d97a58208/volumes" Oct 13 15:45:20 crc kubenswrapper[4797]: I1013 15:45:20.881747 4797 scope.go:117] "RemoveContainer" containerID="ec9d30e7973a3e688af8d06494c7157c62228f27e52664e7956e6d5e4e9a51ce" Oct 13 15:46:48 crc kubenswrapper[4797]: I1013 15:46:48.120045 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:46:48 crc kubenswrapper[4797]: I1013 15:46:48.120656 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:47:18 crc kubenswrapper[4797]: I1013 15:47:18.120705 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:47:18 crc kubenswrapper[4797]: I1013 15:47:18.121341 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.120040 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.120666 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.120723 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.121626 4797 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01b3742782a202fb659c82f51e632c96943ddd476f7bd91be0bd9afd8946b3b8"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.121694 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://01b3742782a202fb659c82f51e632c96943ddd476f7bd91be0bd9afd8946b3b8" gracePeriod=600 Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.631566 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="01b3742782a202fb659c82f51e632c96943ddd476f7bd91be0bd9afd8946b3b8" exitCode=0 Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.631607 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"01b3742782a202fb659c82f51e632c96943ddd476f7bd91be0bd9afd8946b3b8"} Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.632165 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4"} Oct 13 15:47:48 crc kubenswrapper[4797]: I1013 15:47:48.632186 4797 scope.go:117] "RemoveContainer" containerID="0f48189b10ecf709cf828e19132e91dbb262e3daf7016f0a69a4c9c375af6ddf" Oct 13 15:49:48 crc kubenswrapper[4797]: I1013 15:49:48.119542 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:49:48 crc kubenswrapper[4797]: I1013 15:49:48.120175 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.406291 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbhxf"] Oct 13 15:49:54 crc kubenswrapper[4797]: E1013 15:49:54.407460 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" containerName="collect-profiles" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.407480 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" containerName="collect-profiles" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.407734 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1bf59bd-64b8-4ca6-aa53-a9fe81d1beae" containerName="collect-profiles" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.414512 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.435747 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbhxf"] Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.459181 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-catalog-content\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.459277 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-utilities\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.459341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65849\" (UniqueName: \"kubernetes.io/projected/84559e5f-37ae-4957-afba-bf50be184c86-kube-api-access-65849\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.560708 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-catalog-content\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.560777 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-utilities\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.560833 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65849\" (UniqueName: \"kubernetes.io/projected/84559e5f-37ae-4957-afba-bf50be184c86-kube-api-access-65849\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.561610 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-catalog-content\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.561934 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-utilities\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.580052 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65849\" (UniqueName: \"kubernetes.io/projected/84559e5f-37ae-4957-afba-bf50be184c86-kube-api-access-65849\") pod \"community-operators-dbhxf\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:54 crc kubenswrapper[4797]: I1013 15:49:54.744967 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:49:55 crc kubenswrapper[4797]: I1013 15:49:55.294720 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbhxf"] Oct 13 15:49:56 crc kubenswrapper[4797]: I1013 15:49:56.002135 4797 generic.go:334] "Generic (PLEG): container finished" podID="84559e5f-37ae-4957-afba-bf50be184c86" containerID="5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02" exitCode=0 Oct 13 15:49:56 crc kubenswrapper[4797]: I1013 15:49:56.002205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerDied","Data":"5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02"} Oct 13 15:49:56 crc kubenswrapper[4797]: I1013 15:49:56.002464 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerStarted","Data":"7aa79182858ba85c089f7122d177e5d53a472328807b2203c62ea394be5f9652"} Oct 13 15:49:56 crc kubenswrapper[4797]: I1013 15:49:56.004066 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 15:49:58 crc kubenswrapper[4797]: I1013 15:49:58.025796 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerStarted","Data":"aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde"} Oct 13 15:49:59 crc kubenswrapper[4797]: I1013 15:49:59.037544 4797 generic.go:334] "Generic (PLEG): container finished" podID="84559e5f-37ae-4957-afba-bf50be184c86" containerID="aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde" exitCode=0 Oct 13 15:49:59 crc kubenswrapper[4797]: I1013 15:49:59.037615 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerDied","Data":"aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde"} Oct 13 15:50:00 crc kubenswrapper[4797]: I1013 15:50:00.053735 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerStarted","Data":"9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33"} Oct 13 15:50:00 crc kubenswrapper[4797]: I1013 15:50:00.079688 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbhxf" podStartSLOduration=2.535417886 podStartE2EDuration="6.079664604s" podCreationTimestamp="2025-10-13 15:49:54 +0000 UTC" firstStartedPulling="2025-10-13 15:49:56.003819312 +0000 UTC m=+9773.537369568" lastFinishedPulling="2025-10-13 15:49:59.54806603 +0000 UTC m=+9777.081616286" observedRunningTime="2025-10-13 15:50:00.07541176 +0000 UTC m=+9777.608962056" watchObservedRunningTime="2025-10-13 15:50:00.079664604 +0000 UTC m=+9777.613214870" Oct 13 15:50:04 crc kubenswrapper[4797]: I1013 15:50:04.745766 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:50:04 crc kubenswrapper[4797]: I1013 15:50:04.746504 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:50:04 crc kubenswrapper[4797]: I1013 15:50:04.849947 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:50:05 crc kubenswrapper[4797]: I1013 15:50:05.154767 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:50:05 crc kubenswrapper[4797]: I1013 15:50:05.205401 
4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbhxf"] Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.127210 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dbhxf" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="registry-server" containerID="cri-o://9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33" gracePeriod=2 Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.628335 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.746882 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65849\" (UniqueName: \"kubernetes.io/projected/84559e5f-37ae-4957-afba-bf50be184c86-kube-api-access-65849\") pod \"84559e5f-37ae-4957-afba-bf50be184c86\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.747076 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-catalog-content\") pod \"84559e5f-37ae-4957-afba-bf50be184c86\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.747291 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-utilities\") pod \"84559e5f-37ae-4957-afba-bf50be184c86\" (UID: \"84559e5f-37ae-4957-afba-bf50be184c86\") " Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.747955 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-utilities" (OuterVolumeSpecName: "utilities") pod 
"84559e5f-37ae-4957-afba-bf50be184c86" (UID: "84559e5f-37ae-4957-afba-bf50be184c86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.753018 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84559e5f-37ae-4957-afba-bf50be184c86-kube-api-access-65849" (OuterVolumeSpecName: "kube-api-access-65849") pod "84559e5f-37ae-4957-afba-bf50be184c86" (UID: "84559e5f-37ae-4957-afba-bf50be184c86"). InnerVolumeSpecName "kube-api-access-65849". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.849740 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65849\" (UniqueName: \"kubernetes.io/projected/84559e5f-37ae-4957-afba-bf50be184c86-kube-api-access-65849\") on node \"crc\" DevicePath \"\"" Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.849788 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.933121 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84559e5f-37ae-4957-afba-bf50be184c86" (UID: "84559e5f-37ae-4957-afba-bf50be184c86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:50:07 crc kubenswrapper[4797]: I1013 15:50:07.951617 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84559e5f-37ae-4957-afba-bf50be184c86-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.139554 4797 generic.go:334] "Generic (PLEG): container finished" podID="84559e5f-37ae-4957-afba-bf50be184c86" containerID="9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33" exitCode=0 Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.139583 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerDied","Data":"9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33"} Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.140300 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhxf" event={"ID":"84559e5f-37ae-4957-afba-bf50be184c86","Type":"ContainerDied","Data":"7aa79182858ba85c089f7122d177e5d53a472328807b2203c62ea394be5f9652"} Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.139612 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbhxf" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.140424 4797 scope.go:117] "RemoveContainer" containerID="9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.165753 4797 scope.go:117] "RemoveContainer" containerID="aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.186704 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbhxf"] Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.198986 4797 scope.go:117] "RemoveContainer" containerID="5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.205005 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dbhxf"] Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.269732 4797 scope.go:117] "RemoveContainer" containerID="9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33" Oct 13 15:50:08 crc kubenswrapper[4797]: E1013 15:50:08.276506 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33\": container with ID starting with 9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33 not found: ID does not exist" containerID="9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.276580 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33"} err="failed to get container status \"9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33\": rpc error: code = NotFound desc = could not find 
container \"9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33\": container with ID starting with 9c9005fd554193e703fecc982da1c07fe8e263e253ffe44e55315b8449380f33 not found: ID does not exist" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.276626 4797 scope.go:117] "RemoveContainer" containerID="aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde" Oct 13 15:50:08 crc kubenswrapper[4797]: E1013 15:50:08.277024 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde\": container with ID starting with aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde not found: ID does not exist" containerID="aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.277077 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde"} err="failed to get container status \"aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde\": rpc error: code = NotFound desc = could not find container \"aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde\": container with ID starting with aa9fa06e57035d71198195137ca05423ae4885b19fa6745778e85a93744c8fde not found: ID does not exist" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.277114 4797 scope.go:117] "RemoveContainer" containerID="5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02" Oct 13 15:50:08 crc kubenswrapper[4797]: E1013 15:50:08.277556 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02\": container with ID starting with 5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02 not found: ID does 
not exist" containerID="5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02" Oct 13 15:50:08 crc kubenswrapper[4797]: I1013 15:50:08.277601 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02"} err="failed to get container status \"5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02\": rpc error: code = NotFound desc = could not find container \"5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02\": container with ID starting with 5f5ededc14b51be6071792e56a83dbe7654cb747e577845b3ada2f7b6079fd02 not found: ID does not exist" Oct 13 15:50:09 crc kubenswrapper[4797]: I1013 15:50:09.250615 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84559e5f-37ae-4957-afba-bf50be184c86" path="/var/lib/kubelet/pods/84559e5f-37ae-4957-afba-bf50be184c86/volumes" Oct 13 15:50:18 crc kubenswrapper[4797]: I1013 15:50:18.120576 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:50:18 crc kubenswrapper[4797]: I1013 15:50:18.121198 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.120982 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.121417 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.121464 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.122066 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.122123 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" gracePeriod=600 Oct 13 15:50:48 crc kubenswrapper[4797]: E1013 15:50:48.264998 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:50:48 crc kubenswrapper[4797]: 
I1013 15:50:48.532319 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" exitCode=0 Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.532362 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4"} Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.532686 4797 scope.go:117] "RemoveContainer" containerID="01b3742782a202fb659c82f51e632c96943ddd476f7bd91be0bd9afd8946b3b8" Oct 13 15:50:48 crc kubenswrapper[4797]: I1013 15:50:48.533170 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:50:48 crc kubenswrapper[4797]: E1013 15:50:48.533440 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:51:01 crc kubenswrapper[4797]: I1013 15:51:01.237262 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:51:01 crc kubenswrapper[4797]: E1013 15:51:01.238109 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:51:14 crc kubenswrapper[4797]: I1013 15:51:14.236489 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:51:14 crc kubenswrapper[4797]: E1013 15:51:14.237353 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:51:26 crc kubenswrapper[4797]: I1013 15:51:26.235830 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:51:26 crc kubenswrapper[4797]: E1013 15:51:26.236547 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:51:34 crc kubenswrapper[4797]: I1013 15:51:34.006741 4797 generic.go:334] "Generic (PLEG): container finished" podID="b91301d7-01b4-48ec-b44e-12d408a58e1c" containerID="9dad40b66fcc3f0e2c5c03208f75913f92be54c06ac49e240af444ff05192294" exitCode=0 Oct 13 15:51:34 crc kubenswrapper[4797]: I1013 15:51:34.006860 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"b91301d7-01b4-48ec-b44e-12d408a58e1c","Type":"ContainerDied","Data":"9dad40b66fcc3f0e2c5c03208f75913f92be54c06ac49e240af444ff05192294"} Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.478940 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617592 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617749 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config-secret\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617779 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617831 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-workdir\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617866 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ssh-key\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-config-data\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.617979 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ca-certs\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.618017 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-temporary\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.618080 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2w2r\" (UniqueName: \"kubernetes.io/projected/b91301d7-01b4-48ec-b44e-12d408a58e1c-kube-api-access-k2w2r\") pod \"b91301d7-01b4-48ec-b44e-12d408a58e1c\" (UID: \"b91301d7-01b4-48ec-b44e-12d408a58e1c\") " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.619838 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-config-data" (OuterVolumeSpecName: "config-data") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.619919 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.624486 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.626163 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91301d7-01b4-48ec-b44e-12d408a58e1c-kube-api-access-k2w2r" (OuterVolumeSpecName: "kube-api-access-k2w2r") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "kube-api-access-k2w2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.634110 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.647527 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.649878 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.671764 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.674151 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b91301d7-01b4-48ec-b44e-12d408a58e1c" (UID: "b91301d7-01b4-48ec-b44e-12d408a58e1c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720283 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720338 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720351 4797 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720364 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720376 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b91301d7-01b4-48ec-b44e-12d408a58e1c-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720386 4797 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b91301d7-01b4-48ec-b44e-12d408a58e1c-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720397 4797 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b91301d7-01b4-48ec-b44e-12d408a58e1c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 
crc kubenswrapper[4797]: I1013 15:51:35.720411 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2w2r\" (UniqueName: \"kubernetes.io/projected/b91301d7-01b4-48ec-b44e-12d408a58e1c-kube-api-access-k2w2r\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.720452 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.749678 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 13 15:51:35 crc kubenswrapper[4797]: I1013 15:51:35.822679 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 13 15:51:36 crc kubenswrapper[4797]: I1013 15:51:36.027957 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b91301d7-01b4-48ec-b44e-12d408a58e1c","Type":"ContainerDied","Data":"a177c9392d3d3445d8f54082b2985a7f36488c8e82954460e0aca0be78f4f7e9"} Oct 13 15:51:36 crc kubenswrapper[4797]: I1013 15:51:36.027999 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a177c9392d3d3445d8f54082b2985a7f36488c8e82954460e0aca0be78f4f7e9" Oct 13 15:51:36 crc kubenswrapper[4797]: I1013 15:51:36.028032 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 15:51:38 crc kubenswrapper[4797]: I1013 15:51:38.237203 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:51:38 crc kubenswrapper[4797]: E1013 15:51:38.237895 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.515595 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 13 15:51:44 crc kubenswrapper[4797]: E1013 15:51:44.516697 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="extract-utilities" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.516719 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="extract-utilities" Oct 13 15:51:44 crc kubenswrapper[4797]: E1013 15:51:44.516740 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91301d7-01b4-48ec-b44e-12d408a58e1c" containerName="tempest-tests-tempest-tests-runner" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.516750 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91301d7-01b4-48ec-b44e-12d408a58e1c" containerName="tempest-tests-tempest-tests-runner" Oct 13 15:51:44 crc kubenswrapper[4797]: E1013 15:51:44.516786 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="registry-server" Oct 13 15:51:44 crc 
kubenswrapper[4797]: I1013 15:51:44.516795 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="registry-server" Oct 13 15:51:44 crc kubenswrapper[4797]: E1013 15:51:44.516855 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="extract-content" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.516866 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="extract-content" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.517131 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="84559e5f-37ae-4957-afba-bf50be184c86" containerName="registry-server" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.517179 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91301d7-01b4-48ec-b44e-12d408a58e1c" containerName="tempest-tests-tempest-tests-runner" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.518069 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.520460 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dj9th" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.536307 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.708263 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.708495 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znx55\" (UniqueName: \"kubernetes.io/projected/45bc585b-ca78-4517-aba6-d855a810d80b-kube-api-access-znx55\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.810329 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.810475 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znx55\" (UniqueName: 
\"kubernetes.io/projected/45bc585b-ca78-4517-aba6-d855a810d80b-kube-api-access-znx55\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.811203 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.834296 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znx55\" (UniqueName: \"kubernetes.io/projected/45bc585b-ca78-4517-aba6-d855a810d80b-kube-api-access-znx55\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:44 crc kubenswrapper[4797]: I1013 15:51:44.856038 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45bc585b-ca78-4517-aba6-d855a810d80b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:45 crc kubenswrapper[4797]: I1013 15:51:45.136906 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 15:51:45 crc kubenswrapper[4797]: I1013 15:51:45.625299 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 13 15:51:46 crc kubenswrapper[4797]: I1013 15:51:46.122830 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"45bc585b-ca78-4517-aba6-d855a810d80b","Type":"ContainerStarted","Data":"712a84f133a644beca62e91d43d62b9196ee8bb10772e416ab5f1379491800d6"} Oct 13 15:51:48 crc kubenswrapper[4797]: I1013 15:51:48.152623 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"45bc585b-ca78-4517-aba6-d855a810d80b","Type":"ContainerStarted","Data":"c1f3acaca51ba83aba94fdfb0b3283f0391480cceb026164fa7f5332fde4570f"} Oct 13 15:51:48 crc kubenswrapper[4797]: I1013 15:51:48.163367 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.689174542 podStartE2EDuration="4.16334587s" podCreationTimestamp="2025-10-13 15:51:44 +0000 UTC" firstStartedPulling="2025-10-13 15:51:45.615481683 +0000 UTC m=+9883.149031979" lastFinishedPulling="2025-10-13 15:51:47.089653051 +0000 UTC m=+9884.623203307" observedRunningTime="2025-10-13 15:51:48.163261258 +0000 UTC m=+9885.696811524" watchObservedRunningTime="2025-10-13 15:51:48.16334587 +0000 UTC m=+9885.696896136" Oct 13 15:51:53 crc kubenswrapper[4797]: I1013 15:51:53.244659 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:51:53 crc kubenswrapper[4797]: E1013 15:51:53.245691 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:52:07 crc kubenswrapper[4797]: I1013 15:52:07.237129 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:52:07 crc kubenswrapper[4797]: E1013 15:52:07.238099 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:52:22 crc kubenswrapper[4797]: I1013 15:52:22.236550 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:52:22 crc kubenswrapper[4797]: E1013 15:52:22.237743 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.518780 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6s585"] Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.522048 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.535260 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6s585"] Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.691179 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-utilities\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.691939 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-catalog-content\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.692121 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq89h\" (UniqueName: \"kubernetes.io/projected/7620ba0c-ad68-46e5-8210-ce430b5c2b64-kube-api-access-vq89h\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.798267 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-utilities\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.798543 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-utilities\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.799648 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-catalog-content\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.799963 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq89h\" (UniqueName: \"kubernetes.io/projected/7620ba0c-ad68-46e5-8210-ce430b5c2b64-kube-api-access-vq89h\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.800332 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-catalog-content\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.827770 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq89h\" (UniqueName: \"kubernetes.io/projected/7620ba0c-ad68-46e5-8210-ce430b5c2b64-kube-api-access-vq89h\") pod \"redhat-operators-6s585\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:33 crc kubenswrapper[4797]: I1013 15:52:33.855216 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:34 crc kubenswrapper[4797]: I1013 15:52:34.393756 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6s585"] Oct 13 15:52:34 crc kubenswrapper[4797]: I1013 15:52:34.640843 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerStarted","Data":"39678929e2d25384eebd2b6c77de5f7e46816875ed1843917812a9412c4104a5"} Oct 13 15:52:35 crc kubenswrapper[4797]: I1013 15:52:35.670717 4797 generic.go:334] "Generic (PLEG): container finished" podID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerID="1239b4958e5a80f8d18f97dad462fd009b71b1640f1031aa2044ddd8fe72e302" exitCode=0 Oct 13 15:52:35 crc kubenswrapper[4797]: I1013 15:52:35.670794 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerDied","Data":"1239b4958e5a80f8d18f97dad462fd009b71b1640f1031aa2044ddd8fe72e302"} Oct 13 15:52:37 crc kubenswrapper[4797]: I1013 15:52:37.235782 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:52:37 crc kubenswrapper[4797]: E1013 15:52:37.236475 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:52:37 crc kubenswrapper[4797]: I1013 15:52:37.697042 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" 
event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerStarted","Data":"6989d20af8a782895dbd76f10203bb814f1c1390fecf5fc46db24cf05f80cec4"} Oct 13 15:52:40 crc kubenswrapper[4797]: I1013 15:52:40.793542 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dm9p7"] Oct 13 15:52:40 crc kubenswrapper[4797]: I1013 15:52:40.797338 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:40 crc kubenswrapper[4797]: I1013 15:52:40.805432 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm9p7"] Oct 13 15:52:40 crc kubenswrapper[4797]: I1013 15:52:40.965824 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-utilities\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:40 crc kubenswrapper[4797]: I1013 15:52:40.965963 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljxn\" (UniqueName: \"kubernetes.io/projected/e28056df-b19a-4cff-96c4-d40df86a0326-kube-api-access-wljxn\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:40 crc kubenswrapper[4797]: I1013 15:52:40.965999 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-catalog-content\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.069169 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-utilities\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.069315 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wljxn\" (UniqueName: \"kubernetes.io/projected/e28056df-b19a-4cff-96c4-d40df86a0326-kube-api-access-wljxn\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.069359 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-catalog-content\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.069705 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-utilities\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.070007 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-catalog-content\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.092639 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wljxn\" (UniqueName: \"kubernetes.io/projected/e28056df-b19a-4cff-96c4-d40df86a0326-kube-api-access-wljxn\") pod \"redhat-marketplace-dm9p7\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.128323 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.663972 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm9p7"] Oct 13 15:52:41 crc kubenswrapper[4797]: I1013 15:52:41.736048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerStarted","Data":"85e51b983904923a3d315d6a2f94aa6d9bae2eed3a73345c54f8b28c2b5b6047"} Oct 13 15:52:42 crc kubenswrapper[4797]: I1013 15:52:42.745437 4797 generic.go:334] "Generic (PLEG): container finished" podID="e28056df-b19a-4cff-96c4-d40df86a0326" containerID="e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d" exitCode=0 Oct 13 15:52:42 crc kubenswrapper[4797]: I1013 15:52:42.745494 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerDied","Data":"e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d"} Oct 13 15:52:45 crc kubenswrapper[4797]: I1013 15:52:45.779439 4797 generic.go:334] "Generic (PLEG): container finished" podID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerID="6989d20af8a782895dbd76f10203bb814f1c1390fecf5fc46db24cf05f80cec4" exitCode=0 Oct 13 15:52:45 crc kubenswrapper[4797]: I1013 15:52:45.779545 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" 
event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerDied","Data":"6989d20af8a782895dbd76f10203bb814f1c1390fecf5fc46db24cf05f80cec4"} Oct 13 15:52:46 crc kubenswrapper[4797]: I1013 15:52:46.792598 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerStarted","Data":"43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d"} Oct 13 15:52:46 crc kubenswrapper[4797]: I1013 15:52:46.795139 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerStarted","Data":"46c7d078066c87520d4f5f084ebcdd0c941983e32a2d8da46541169a281a6c5a"} Oct 13 15:52:46 crc kubenswrapper[4797]: I1013 15:52:46.828779 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6s585" podStartSLOduration=3.026009915 podStartE2EDuration="13.828754261s" podCreationTimestamp="2025-10-13 15:52:33 +0000 UTC" firstStartedPulling="2025-10-13 15:52:35.673703303 +0000 UTC m=+9933.207253569" lastFinishedPulling="2025-10-13 15:52:46.476447659 +0000 UTC m=+9944.009997915" observedRunningTime="2025-10-13 15:52:46.824088286 +0000 UTC m=+9944.357638542" watchObservedRunningTime="2025-10-13 15:52:46.828754261 +0000 UTC m=+9944.362304527" Oct 13 15:52:47 crc kubenswrapper[4797]: I1013 15:52:47.804971 4797 generic.go:334] "Generic (PLEG): container finished" podID="e28056df-b19a-4cff-96c4-d40df86a0326" containerID="43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d" exitCode=0 Oct 13 15:52:47 crc kubenswrapper[4797]: I1013 15:52:47.805074 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" 
event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerDied","Data":"43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d"} Oct 13 15:52:48 crc kubenswrapper[4797]: I1013 15:52:48.237013 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:52:48 crc kubenswrapper[4797]: E1013 15:52:48.237456 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:52:48 crc kubenswrapper[4797]: I1013 15:52:48.821930 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerStarted","Data":"b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14"} Oct 13 15:52:48 crc kubenswrapper[4797]: I1013 15:52:48.867699 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dm9p7" podStartSLOduration=3.277019968 podStartE2EDuration="8.867671121s" podCreationTimestamp="2025-10-13 15:52:40 +0000 UTC" firstStartedPulling="2025-10-13 15:52:42.747624998 +0000 UTC m=+9940.281175264" lastFinishedPulling="2025-10-13 15:52:48.338276161 +0000 UTC m=+9945.871826417" observedRunningTime="2025-10-13 15:52:48.839849595 +0000 UTC m=+9946.373399891" watchObservedRunningTime="2025-10-13 15:52:48.867671121 +0000 UTC m=+9946.401221387" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.129399 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:51 crc 
kubenswrapper[4797]: I1013 15:52:51.129764 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.182217 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.756029 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvrb6/must-gather-5rh4w"] Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.759142 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.766425 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kvrb6/must-gather-5rh4w"] Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.788742 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kvrb6"/"openshift-service-ca.crt" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.789049 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kvrb6"/"default-dockercfg-z2fdx" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.789199 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kvrb6"/"kube-root-ca.crt" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.907602 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e1563cf-9081-41da-b894-f07f4ff18604-must-gather-output\") pod \"must-gather-5rh4w\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:51 crc kubenswrapper[4797]: I1013 15:52:51.907778 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mhvdq\" (UniqueName: \"kubernetes.io/projected/1e1563cf-9081-41da-b894-f07f4ff18604-kube-api-access-mhvdq\") pod \"must-gather-5rh4w\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.009505 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvdq\" (UniqueName: \"kubernetes.io/projected/1e1563cf-9081-41da-b894-f07f4ff18604-kube-api-access-mhvdq\") pod \"must-gather-5rh4w\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.009650 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e1563cf-9081-41da-b894-f07f4ff18604-must-gather-output\") pod \"must-gather-5rh4w\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.010216 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e1563cf-9081-41da-b894-f07f4ff18604-must-gather-output\") pod \"must-gather-5rh4w\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.030372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvdq\" (UniqueName: \"kubernetes.io/projected/1e1563cf-9081-41da-b894-f07f4ff18604-kube-api-access-mhvdq\") pod \"must-gather-5rh4w\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.120235 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.575830 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kvrb6/must-gather-5rh4w"] Oct 13 15:52:52 crc kubenswrapper[4797]: W1013 15:52:52.578178 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e1563cf_9081_41da_b894_f07f4ff18604.slice/crio-6e1f437adc41e16d7e18b98bfabc344ea3a7c889a3def6d9b3de7d073fd953ee WatchSource:0}: Error finding container 6e1f437adc41e16d7e18b98bfabc344ea3a7c889a3def6d9b3de7d073fd953ee: Status 404 returned error can't find the container with id 6e1f437adc41e16d7e18b98bfabc344ea3a7c889a3def6d9b3de7d073fd953ee Oct 13 15:52:52 crc kubenswrapper[4797]: I1013 15:52:52.862464 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" event={"ID":"1e1563cf-9081-41da-b894-f07f4ff18604","Type":"ContainerStarted","Data":"6e1f437adc41e16d7e18b98bfabc344ea3a7c889a3def6d9b3de7d073fd953ee"} Oct 13 15:52:53 crc kubenswrapper[4797]: I1013 15:52:53.856896 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:53 crc kubenswrapper[4797]: I1013 15:52:53.857267 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:53 crc kubenswrapper[4797]: I1013 15:52:53.923069 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:53 crc kubenswrapper[4797]: I1013 15:52:53.969579 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:54 crc kubenswrapper[4797]: I1013 15:52:54.163339 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-6s585"] Oct 13 15:52:55 crc kubenswrapper[4797]: I1013 15:52:55.896131 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6s585" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="registry-server" containerID="cri-o://46c7d078066c87520d4f5f084ebcdd0c941983e32a2d8da46541169a281a6c5a" gracePeriod=2 Oct 13 15:52:56 crc kubenswrapper[4797]: I1013 15:52:56.908097 4797 generic.go:334] "Generic (PLEG): container finished" podID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerID="46c7d078066c87520d4f5f084ebcdd0c941983e32a2d8da46541169a281a6c5a" exitCode=0 Oct 13 15:52:56 crc kubenswrapper[4797]: I1013 15:52:56.908122 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerDied","Data":"46c7d078066c87520d4f5f084ebcdd0c941983e32a2d8da46541169a281a6c5a"} Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.763624 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.835935 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-catalog-content\") pod \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.836218 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-utilities\") pod \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.836270 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq89h\" (UniqueName: \"kubernetes.io/projected/7620ba0c-ad68-46e5-8210-ce430b5c2b64-kube-api-access-vq89h\") pod \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\" (UID: \"7620ba0c-ad68-46e5-8210-ce430b5c2b64\") " Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.837051 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-utilities" (OuterVolumeSpecName: "utilities") pod "7620ba0c-ad68-46e5-8210-ce430b5c2b64" (UID: "7620ba0c-ad68-46e5-8210-ce430b5c2b64"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.846254 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.849320 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7620ba0c-ad68-46e5-8210-ce430b5c2b64-kube-api-access-vq89h" (OuterVolumeSpecName: "kube-api-access-vq89h") pod "7620ba0c-ad68-46e5-8210-ce430b5c2b64" (UID: "7620ba0c-ad68-46e5-8210-ce430b5c2b64"). InnerVolumeSpecName "kube-api-access-vq89h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.919300 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s585" event={"ID":"7620ba0c-ad68-46e5-8210-ce430b5c2b64","Type":"ContainerDied","Data":"39678929e2d25384eebd2b6c77de5f7e46816875ed1843917812a9412c4104a5"} Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.919370 4797 scope.go:117] "RemoveContainer" containerID="46c7d078066c87520d4f5f084ebcdd0c941983e32a2d8da46541169a281a6c5a" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.919563 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s585" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.949769 4797 scope.go:117] "RemoveContainer" containerID="6989d20af8a782895dbd76f10203bb814f1c1390fecf5fc46db24cf05f80cec4" Oct 13 15:52:57 crc kubenswrapper[4797]: I1013 15:52:57.958698 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq89h\" (UniqueName: \"kubernetes.io/projected/7620ba0c-ad68-46e5-8210-ce430b5c2b64-kube-api-access-vq89h\") on node \"crc\" DevicePath \"\"" Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.005974 4797 scope.go:117] "RemoveContainer" containerID="1239b4958e5a80f8d18f97dad462fd009b71b1640f1031aa2044ddd8fe72e302" Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.025308 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7620ba0c-ad68-46e5-8210-ce430b5c2b64" (UID: "7620ba0c-ad68-46e5-8210-ce430b5c2b64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.061417 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7620ba0c-ad68-46e5-8210-ce430b5c2b64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.261398 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6s585"] Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.279117 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6s585"] Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.934484 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" event={"ID":"1e1563cf-9081-41da-b894-f07f4ff18604","Type":"ContainerStarted","Data":"c27e25ebe88fca8d591eda56eddf8511644e7b9446cb763294ef62f3f66e0cde"} Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.934557 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" event={"ID":"1e1563cf-9081-41da-b894-f07f4ff18604","Type":"ContainerStarted","Data":"3e2767edb4ccc39ed30beabb8ca5bd06ab049fb29447f0b4099941a5cf68808b"} Oct 13 15:52:58 crc kubenswrapper[4797]: I1013 15:52:58.959025 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" podStartSLOduration=2.90976082 podStartE2EDuration="7.959003857s" podCreationTimestamp="2025-10-13 15:52:51 +0000 UTC" firstStartedPulling="2025-10-13 15:52:52.580744404 +0000 UTC m=+9950.114294650" lastFinishedPulling="2025-10-13 15:52:57.629987431 +0000 UTC m=+9955.163537687" observedRunningTime="2025-10-13 15:52:58.948316784 +0000 UTC m=+9956.481867050" watchObservedRunningTime="2025-10-13 15:52:58.959003857 +0000 UTC m=+9956.492554123" Oct 13 15:52:59 crc kubenswrapper[4797]: I1013 
15:52:59.247753 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" path="/var/lib/kubelet/pods/7620ba0c-ad68-46e5-8210-ce430b5c2b64/volumes" Oct 13 15:53:01 crc kubenswrapper[4797]: I1013 15:53:01.192191 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:53:01 crc kubenswrapper[4797]: I1013 15:53:01.263546 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm9p7"] Oct 13 15:53:01 crc kubenswrapper[4797]: I1013 15:53:01.961646 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dm9p7" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="registry-server" containerID="cri-o://b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14" gracePeriod=2 Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.236270 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:53:02 crc kubenswrapper[4797]: E1013 15:53:02.236709 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.437297 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.588720 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-catalog-content\") pod \"e28056df-b19a-4cff-96c4-d40df86a0326\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.588924 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-utilities\") pod \"e28056df-b19a-4cff-96c4-d40df86a0326\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.589002 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wljxn\" (UniqueName: \"kubernetes.io/projected/e28056df-b19a-4cff-96c4-d40df86a0326-kube-api-access-wljxn\") pod \"e28056df-b19a-4cff-96c4-d40df86a0326\" (UID: \"e28056df-b19a-4cff-96c4-d40df86a0326\") " Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.589834 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-utilities" (OuterVolumeSpecName: "utilities") pod "e28056df-b19a-4cff-96c4-d40df86a0326" (UID: "e28056df-b19a-4cff-96c4-d40df86a0326"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.594101 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28056df-b19a-4cff-96c4-d40df86a0326-kube-api-access-wljxn" (OuterVolumeSpecName: "kube-api-access-wljxn") pod "e28056df-b19a-4cff-96c4-d40df86a0326" (UID: "e28056df-b19a-4cff-96c4-d40df86a0326"). InnerVolumeSpecName "kube-api-access-wljxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.604794 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e28056df-b19a-4cff-96c4-d40df86a0326" (UID: "e28056df-b19a-4cff-96c4-d40df86a0326"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.691152 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.691510 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wljxn\" (UniqueName: \"kubernetes.io/projected/e28056df-b19a-4cff-96c4-d40df86a0326-kube-api-access-wljxn\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.691526 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28056df-b19a-4cff-96c4-d40df86a0326-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.975848 4797 generic.go:334] "Generic (PLEG): container finished" podID="e28056df-b19a-4cff-96c4-d40df86a0326" containerID="b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14" exitCode=0 Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.975915 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerDied","Data":"b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14"} Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.975941 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm9p7" Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.975960 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm9p7" event={"ID":"e28056df-b19a-4cff-96c4-d40df86a0326","Type":"ContainerDied","Data":"85e51b983904923a3d315d6a2f94aa6d9bae2eed3a73345c54f8b28c2b5b6047"} Oct 13 15:53:02 crc kubenswrapper[4797]: I1013 15:53:02.975988 4797 scope.go:117] "RemoveContainer" containerID="b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.003199 4797 scope.go:117] "RemoveContainer" containerID="43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.042668 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm9p7"] Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.051215 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm9p7"] Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.053440 4797 scope.go:117] "RemoveContainer" containerID="e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.093829 4797 scope.go:117] "RemoveContainer" containerID="b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14" Oct 13 15:53:03 crc kubenswrapper[4797]: E1013 15:53:03.094388 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14\": container with ID starting with b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14 not found: ID does not exist" containerID="b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.094426 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14"} err="failed to get container status \"b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14\": rpc error: code = NotFound desc = could not find container \"b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14\": container with ID starting with b33d24924ce8b6469f7656848eb46203d4392e6fd83f8dff470e73d34cd99a14 not found: ID does not exist" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.094454 4797 scope.go:117] "RemoveContainer" containerID="43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d" Oct 13 15:53:03 crc kubenswrapper[4797]: E1013 15:53:03.094910 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d\": container with ID starting with 43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d not found: ID does not exist" containerID="43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.094958 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d"} err="failed to get container status \"43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d\": rpc error: code = NotFound desc = could not find container \"43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d\": container with ID starting with 43eab859e515986d187d8d864240239a20d2cf54de3dc5e1c1b15e3956daab6d not found: ID does not exist" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.094993 4797 scope.go:117] "RemoveContainer" containerID="e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d" Oct 13 15:53:03 crc kubenswrapper[4797]: E1013 
15:53:03.095401 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d\": container with ID starting with e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d not found: ID does not exist" containerID="e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.095431 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d"} err="failed to get container status \"e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d\": rpc error: code = NotFound desc = could not find container \"e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d\": container with ID starting with e37b1cb568e05cabfda22b5212ce782b422323b5333548275a80b44e78c9ed3d not found: ID does not exist" Oct 13 15:53:03 crc kubenswrapper[4797]: I1013 15:53:03.261012 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" path="/var/lib/kubelet/pods/e28056df-b19a-4cff-96c4-d40df86a0326/volumes" Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.400040 4797 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.147:48900->38.102.83.147:46853: write tcp 38.102.83.147:48900->38.102.83.147:46853: write: broken pipe Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.917720 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvrb6/crc-debug-cml7w"] Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.919106 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="registry-server" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919126 4797 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="registry-server" Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.919157 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="registry-server" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919165 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="registry-server" Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.919176 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="extract-content" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919185 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="extract-content" Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.919204 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="extract-utilities" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919211 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="extract-utilities" Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.919225 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="extract-content" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919231 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="extract-content" Oct 13 15:53:06 crc kubenswrapper[4797]: E1013 15:53:06.919248 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="extract-utilities" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919257 4797 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="extract-utilities" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919495 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7620ba0c-ad68-46e5-8210-ce430b5c2b64" containerName="registry-server" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.919519 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28056df-b19a-4cff-96c4-d40df86a0326" containerName="registry-server" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.920438 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.977311 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpxt\" (UniqueName: \"kubernetes.io/projected/2b618c96-0776-4d57-b5af-98cfb18b5a58-kube-api-access-ljpxt\") pod \"crc-debug-cml7w\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:06 crc kubenswrapper[4797]: I1013 15:53:06.977405 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b618c96-0776-4d57-b5af-98cfb18b5a58-host\") pod \"crc-debug-cml7w\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:07 crc kubenswrapper[4797]: I1013 15:53:07.079382 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpxt\" (UniqueName: \"kubernetes.io/projected/2b618c96-0776-4d57-b5af-98cfb18b5a58-kube-api-access-ljpxt\") pod \"crc-debug-cml7w\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:07 crc kubenswrapper[4797]: I1013 15:53:07.079476 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b618c96-0776-4d57-b5af-98cfb18b5a58-host\") pod \"crc-debug-cml7w\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:07 crc kubenswrapper[4797]: I1013 15:53:07.079701 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b618c96-0776-4d57-b5af-98cfb18b5a58-host\") pod \"crc-debug-cml7w\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:07 crc kubenswrapper[4797]: I1013 15:53:07.111041 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpxt\" (UniqueName: \"kubernetes.io/projected/2b618c96-0776-4d57-b5af-98cfb18b5a58-kube-api-access-ljpxt\") pod \"crc-debug-cml7w\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:07 crc kubenswrapper[4797]: I1013 15:53:07.253017 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:08 crc kubenswrapper[4797]: I1013 15:53:08.023664 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" event={"ID":"2b618c96-0776-4d57-b5af-98cfb18b5a58","Type":"ContainerStarted","Data":"f4385f34a482c6a1b3b51e44e9b6689e35b130a699aab46a823b68d409ba0c94"} Oct 13 15:53:13 crc kubenswrapper[4797]: I1013 15:53:13.243689 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:53:13 crc kubenswrapper[4797]: E1013 15:53:13.244320 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:53:19 crc kubenswrapper[4797]: I1013 15:53:19.144364 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" event={"ID":"2b618c96-0776-4d57-b5af-98cfb18b5a58","Type":"ContainerStarted","Data":"e1cc6bf297f86f4eb8bc83bf953048585517720b009de660eeb1946380951b09"} Oct 13 15:53:19 crc kubenswrapper[4797]: I1013 15:53:19.161776 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" podStartSLOduration=1.820652107 podStartE2EDuration="13.161751615s" podCreationTimestamp="2025-10-13 15:53:06 +0000 UTC" firstStartedPulling="2025-10-13 15:53:07.292520028 +0000 UTC m=+9964.826070284" lastFinishedPulling="2025-10-13 15:53:18.633619536 +0000 UTC m=+9976.167169792" observedRunningTime="2025-10-13 15:53:19.15705968 +0000 UTC m=+9976.690609936" watchObservedRunningTime="2025-10-13 15:53:19.161751615 +0000 UTC 
m=+9976.695301871" Oct 13 15:53:25 crc kubenswrapper[4797]: I1013 15:53:25.238631 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:53:25 crc kubenswrapper[4797]: E1013 15:53:25.239460 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.655857 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lbgts"] Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.659310 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.690963 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lbgts"] Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.761354 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhb55\" (UniqueName: \"kubernetes.io/projected/eb194e5f-1a8b-4f25-abc4-d481be912fff-kube-api-access-jhb55\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.761793 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-catalog-content\") pod \"certified-operators-lbgts\" (UID: 
\"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.761847 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-utilities\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.864004 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhb55\" (UniqueName: \"kubernetes.io/projected/eb194e5f-1a8b-4f25-abc4-d481be912fff-kube-api-access-jhb55\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.864124 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-catalog-content\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.864166 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-utilities\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.865015 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-catalog-content\") pod \"certified-operators-lbgts\" (UID: 
\"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.865018 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-utilities\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:32 crc kubenswrapper[4797]: I1013 15:53:32.883608 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhb55\" (UniqueName: \"kubernetes.io/projected/eb194e5f-1a8b-4f25-abc4-d481be912fff-kube-api-access-jhb55\") pod \"certified-operators-lbgts\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:33 crc kubenswrapper[4797]: I1013 15:53:33.046917 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:33 crc kubenswrapper[4797]: I1013 15:53:33.634346 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lbgts"] Oct 13 15:53:34 crc kubenswrapper[4797]: I1013 15:53:34.341271 4797 generic.go:334] "Generic (PLEG): container finished" podID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerID="4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c" exitCode=0 Oct 13 15:53:34 crc kubenswrapper[4797]: I1013 15:53:34.341376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbgts" event={"ID":"eb194e5f-1a8b-4f25-abc4-d481be912fff","Type":"ContainerDied","Data":"4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c"} Oct 13 15:53:34 crc kubenswrapper[4797]: I1013 15:53:34.341724 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbgts" 
event={"ID":"eb194e5f-1a8b-4f25-abc4-d481be912fff","Type":"ContainerStarted","Data":"cd7aca5ce481900aca845fffdc8040f7b436c8ded72443026a22306affe34f4d"} Oct 13 15:53:36 crc kubenswrapper[4797]: I1013 15:53:36.362480 4797 generic.go:334] "Generic (PLEG): container finished" podID="2b618c96-0776-4d57-b5af-98cfb18b5a58" containerID="e1cc6bf297f86f4eb8bc83bf953048585517720b009de660eeb1946380951b09" exitCode=0 Oct 13 15:53:36 crc kubenswrapper[4797]: I1013 15:53:36.362581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" event={"ID":"2b618c96-0776-4d57-b5af-98cfb18b5a58","Type":"ContainerDied","Data":"e1cc6bf297f86f4eb8bc83bf953048585517720b009de660eeb1946380951b09"} Oct 13 15:53:36 crc kubenswrapper[4797]: I1013 15:53:36.366015 4797 generic.go:334] "Generic (PLEG): container finished" podID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerID="67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f" exitCode=0 Oct 13 15:53:36 crc kubenswrapper[4797]: I1013 15:53:36.366071 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbgts" event={"ID":"eb194e5f-1a8b-4f25-abc4-d481be912fff","Type":"ContainerDied","Data":"67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f"} Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.377308 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbgts" event={"ID":"eb194e5f-1a8b-4f25-abc4-d481be912fff","Type":"ContainerStarted","Data":"1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5"} Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.414029 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lbgts" podStartSLOduration=2.952782874 podStartE2EDuration="5.414013894s" podCreationTimestamp="2025-10-13 15:53:32 +0000 UTC" firstStartedPulling="2025-10-13 15:53:34.343956014 +0000 UTC 
m=+9991.877506270" lastFinishedPulling="2025-10-13 15:53:36.805187034 +0000 UTC m=+9994.338737290" observedRunningTime="2025-10-13 15:53:37.409672607 +0000 UTC m=+9994.943222883" watchObservedRunningTime="2025-10-13 15:53:37.414013894 +0000 UTC m=+9994.947564150" Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.492302 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.533991 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvrb6/crc-debug-cml7w"] Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.543661 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvrb6/crc-debug-cml7w"] Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.670992 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b618c96-0776-4d57-b5af-98cfb18b5a58-host\") pod \"2b618c96-0776-4d57-b5af-98cfb18b5a58\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.671045 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpxt\" (UniqueName: \"kubernetes.io/projected/2b618c96-0776-4d57-b5af-98cfb18b5a58-kube-api-access-ljpxt\") pod \"2b618c96-0776-4d57-b5af-98cfb18b5a58\" (UID: \"2b618c96-0776-4d57-b5af-98cfb18b5a58\") " Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.671100 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b618c96-0776-4d57-b5af-98cfb18b5a58-host" (OuterVolumeSpecName: "host") pod "2b618c96-0776-4d57-b5af-98cfb18b5a58" (UID: "2b618c96-0776-4d57-b5af-98cfb18b5a58"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.671541 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b618c96-0776-4d57-b5af-98cfb18b5a58-host\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.681572 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b618c96-0776-4d57-b5af-98cfb18b5a58-kube-api-access-ljpxt" (OuterVolumeSpecName: "kube-api-access-ljpxt") pod "2b618c96-0776-4d57-b5af-98cfb18b5a58" (UID: "2b618c96-0776-4d57-b5af-98cfb18b5a58"). InnerVolumeSpecName "kube-api-access-ljpxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:53:37 crc kubenswrapper[4797]: I1013 15:53:37.772986 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpxt\" (UniqueName: \"kubernetes.io/projected/2b618c96-0776-4d57-b5af-98cfb18b5a58-kube-api-access-ljpxt\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.236951 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:53:38 crc kubenswrapper[4797]: E1013 15:53:38.237211 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.388674 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4385f34a482c6a1b3b51e44e9b6689e35b130a699aab46a823b68d409ba0c94" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 
15:53:38.388694 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-cml7w" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.774138 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvrb6/crc-debug-p6zjg"] Oct 13 15:53:38 crc kubenswrapper[4797]: E1013 15:53:38.774609 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b618c96-0776-4d57-b5af-98cfb18b5a58" containerName="container-00" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.774629 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b618c96-0776-4d57-b5af-98cfb18b5a58" containerName="container-00" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.774829 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b618c96-0776-4d57-b5af-98cfb18b5a58" containerName="container-00" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.777312 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.897000 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ceca53c-91aa-4d19-bcff-f03db87675fb-host\") pod \"crc-debug-p6zjg\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.897052 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgn7\" (UniqueName: \"kubernetes.io/projected/3ceca53c-91aa-4d19-bcff-f03db87675fb-kube-api-access-jhgn7\") pod \"crc-debug-p6zjg\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.998432 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ceca53c-91aa-4d19-bcff-f03db87675fb-host\") pod \"crc-debug-p6zjg\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.998496 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgn7\" (UniqueName: \"kubernetes.io/projected/3ceca53c-91aa-4d19-bcff-f03db87675fb-kube-api-access-jhgn7\") pod \"crc-debug-p6zjg\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:38 crc kubenswrapper[4797]: I1013 15:53:38.998994 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ceca53c-91aa-4d19-bcff-f03db87675fb-host\") pod \"crc-debug-p6zjg\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:39 crc 
kubenswrapper[4797]: I1013 15:53:39.033705 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgn7\" (UniqueName: \"kubernetes.io/projected/3ceca53c-91aa-4d19-bcff-f03db87675fb-kube-api-access-jhgn7\") pod \"crc-debug-p6zjg\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:39 crc kubenswrapper[4797]: I1013 15:53:39.104346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:39 crc kubenswrapper[4797]: I1013 15:53:39.250183 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b618c96-0776-4d57-b5af-98cfb18b5a58" path="/var/lib/kubelet/pods/2b618c96-0776-4d57-b5af-98cfb18b5a58/volumes" Oct 13 15:53:39 crc kubenswrapper[4797]: I1013 15:53:39.407838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" event={"ID":"3ceca53c-91aa-4d19-bcff-f03db87675fb","Type":"ContainerStarted","Data":"39abd9102a446fe569f58ea85cc32be76af4dd474f6651a20a64c302e280ea6a"} Oct 13 15:53:39 crc kubenswrapper[4797]: I1013 15:53:39.408188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" event={"ID":"3ceca53c-91aa-4d19-bcff-f03db87675fb","Type":"ContainerStarted","Data":"0ad35c3a891f248d44bb73c8ffff93930c807265e3c5a86cfacb5a2a97d19207"} Oct 13 15:53:39 crc kubenswrapper[4797]: I1013 15:53:39.450147 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvrb6/crc-debug-p6zjg"] Oct 13 15:53:39 crc kubenswrapper[4797]: I1013 15:53:39.460720 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvrb6/crc-debug-p6zjg"] Oct 13 15:53:40 crc kubenswrapper[4797]: I1013 15:53:40.424138 4797 generic.go:334] "Generic (PLEG): container finished" podID="3ceca53c-91aa-4d19-bcff-f03db87675fb" 
containerID="39abd9102a446fe569f58ea85cc32be76af4dd474f6651a20a64c302e280ea6a" exitCode=1 Oct 13 15:53:40 crc kubenswrapper[4797]: I1013 15:53:40.917540 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.045412 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhgn7\" (UniqueName: \"kubernetes.io/projected/3ceca53c-91aa-4d19-bcff-f03db87675fb-kube-api-access-jhgn7\") pod \"3ceca53c-91aa-4d19-bcff-f03db87675fb\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.045678 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ceca53c-91aa-4d19-bcff-f03db87675fb-host\") pod \"3ceca53c-91aa-4d19-bcff-f03db87675fb\" (UID: \"3ceca53c-91aa-4d19-bcff-f03db87675fb\") " Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.045730 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ceca53c-91aa-4d19-bcff-f03db87675fb-host" (OuterVolumeSpecName: "host") pod "3ceca53c-91aa-4d19-bcff-f03db87675fb" (UID: "3ceca53c-91aa-4d19-bcff-f03db87675fb"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.046219 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ceca53c-91aa-4d19-bcff-f03db87675fb-host\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.051229 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ceca53c-91aa-4d19-bcff-f03db87675fb-kube-api-access-jhgn7" (OuterVolumeSpecName: "kube-api-access-jhgn7") pod "3ceca53c-91aa-4d19-bcff-f03db87675fb" (UID: "3ceca53c-91aa-4d19-bcff-f03db87675fb"). InnerVolumeSpecName "kube-api-access-jhgn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.148062 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhgn7\" (UniqueName: \"kubernetes.io/projected/3ceca53c-91aa-4d19-bcff-f03db87675fb-kube-api-access-jhgn7\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.249474 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ceca53c-91aa-4d19-bcff-f03db87675fb" path="/var/lib/kubelet/pods/3ceca53c-91aa-4d19-bcff-f03db87675fb/volumes" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.435087 4797 scope.go:117] "RemoveContainer" containerID="39abd9102a446fe569f58ea85cc32be76af4dd474f6651a20a64c302e280ea6a" Oct 13 15:53:41 crc kubenswrapper[4797]: I1013 15:53:41.435116 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvrb6/crc-debug-p6zjg" Oct 13 15:53:43 crc kubenswrapper[4797]: I1013 15:53:43.047456 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:43 crc kubenswrapper[4797]: I1013 15:53:43.047826 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:43 crc kubenswrapper[4797]: I1013 15:53:43.103799 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:43 crc kubenswrapper[4797]: I1013 15:53:43.498844 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:43 crc kubenswrapper[4797]: I1013 15:53:43.553536 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lbgts"] Oct 13 15:53:45 crc kubenswrapper[4797]: I1013 15:53:45.474103 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lbgts" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="registry-server" containerID="cri-o://1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5" gracePeriod=2 Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.013889 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.181199 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-utilities\") pod \"eb194e5f-1a8b-4f25-abc4-d481be912fff\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.181268 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-catalog-content\") pod \"eb194e5f-1a8b-4f25-abc4-d481be912fff\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.181410 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhb55\" (UniqueName: \"kubernetes.io/projected/eb194e5f-1a8b-4f25-abc4-d481be912fff-kube-api-access-jhb55\") pod \"eb194e5f-1a8b-4f25-abc4-d481be912fff\" (UID: \"eb194e5f-1a8b-4f25-abc4-d481be912fff\") " Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.182041 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-utilities" (OuterVolumeSpecName: "utilities") pod "eb194e5f-1a8b-4f25-abc4-d481be912fff" (UID: "eb194e5f-1a8b-4f25-abc4-d481be912fff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.186997 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb194e5f-1a8b-4f25-abc4-d481be912fff-kube-api-access-jhb55" (OuterVolumeSpecName: "kube-api-access-jhb55") pod "eb194e5f-1a8b-4f25-abc4-d481be912fff" (UID: "eb194e5f-1a8b-4f25-abc4-d481be912fff"). InnerVolumeSpecName "kube-api-access-jhb55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.230395 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb194e5f-1a8b-4f25-abc4-d481be912fff" (UID: "eb194e5f-1a8b-4f25-abc4-d481be912fff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.284582 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhb55\" (UniqueName: \"kubernetes.io/projected/eb194e5f-1a8b-4f25-abc4-d481be912fff-kube-api-access-jhb55\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.284618 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.284630 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb194e5f-1a8b-4f25-abc4-d481be912fff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.483878 4797 generic.go:334] "Generic (PLEG): container finished" podID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerID="1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5" exitCode=0 Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.483927 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbgts" event={"ID":"eb194e5f-1a8b-4f25-abc4-d481be912fff","Type":"ContainerDied","Data":"1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5"} Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.483959 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-lbgts" event={"ID":"eb194e5f-1a8b-4f25-abc4-d481be912fff","Type":"ContainerDied","Data":"cd7aca5ce481900aca845fffdc8040f7b436c8ded72443026a22306affe34f4d"} Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.483980 4797 scope.go:117] "RemoveContainer" containerID="1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.484142 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbgts" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.505709 4797 scope.go:117] "RemoveContainer" containerID="67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.519935 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lbgts"] Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.530123 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lbgts"] Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.538489 4797 scope.go:117] "RemoveContainer" containerID="4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.555917 4797 scope.go:117] "RemoveContainer" containerID="1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5" Oct 13 15:53:46 crc kubenswrapper[4797]: E1013 15:53:46.556408 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5\": container with ID starting with 1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5 not found: ID does not exist" containerID="1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 
15:53:46.556445 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5"} err="failed to get container status \"1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5\": rpc error: code = NotFound desc = could not find container \"1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5\": container with ID starting with 1ae1a7f2fc67788ed40796e8ecd8284e701b39a8561b9681fafb237844b266e5 not found: ID does not exist" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.556467 4797 scope.go:117] "RemoveContainer" containerID="67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f" Oct 13 15:53:46 crc kubenswrapper[4797]: E1013 15:53:46.556925 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f\": container with ID starting with 67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f not found: ID does not exist" containerID="67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.556979 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f"} err="failed to get container status \"67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f\": rpc error: code = NotFound desc = could not find container \"67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f\": container with ID starting with 67756563b3685b7cbc34b8716a45e41081bb3880344387f52bf6d3549f52cf7f not found: ID does not exist" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.557012 4797 scope.go:117] "RemoveContainer" containerID="4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c" Oct 13 15:53:46 crc 
kubenswrapper[4797]: E1013 15:53:46.557520 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c\": container with ID starting with 4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c not found: ID does not exist" containerID="4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c" Oct 13 15:53:46 crc kubenswrapper[4797]: I1013 15:53:46.557575 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c"} err="failed to get container status \"4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c\": rpc error: code = NotFound desc = could not find container \"4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c\": container with ID starting with 4131a27e9c7d9b8e259cacb8d2fa4fe7aa6602b723498ed17604562e38810b3c not found: ID does not exist" Oct 13 15:53:47 crc kubenswrapper[4797]: I1013 15:53:47.272435 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" path="/var/lib/kubelet/pods/eb194e5f-1a8b-4f25-abc4-d481be912fff/volumes" Oct 13 15:53:52 crc kubenswrapper[4797]: I1013 15:53:52.236421 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:53:52 crc kubenswrapper[4797]: E1013 15:53:52.237248 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:54:05 crc 
kubenswrapper[4797]: I1013 15:54:05.237247 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:54:05 crc kubenswrapper[4797]: E1013 15:54:05.238136 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:54:16 crc kubenswrapper[4797]: I1013 15:54:16.236897 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:54:16 crc kubenswrapper[4797]: E1013 15:54:16.237577 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.370831 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8431ed84-8990-4a23-9d29-0f7ef6dfc9d0/init-config-reloader/0.log" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.592842 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8431ed84-8990-4a23-9d29-0f7ef6dfc9d0/init-config-reloader/0.log" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.622797 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_8431ed84-8990-4a23-9d29-0f7ef6dfc9d0/config-reloader/0.log" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.641256 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8431ed84-8990-4a23-9d29-0f7ef6dfc9d0/alertmanager/0.log" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.786882 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3/aodh-api/0.log" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.833025 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3/aodh-evaluator/0.log" Oct 13 15:54:19 crc kubenswrapper[4797]: I1013 15:54:19.974490 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3/aodh-listener/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.035345 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a7c7e5e8-bc03-4870-a67d-fb86ae0e41e3/aodh-notifier/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.186752 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b7d9bc6cb-z65xd_e2fb17bd-c741-47d7-8f3b-c9057c58105c/barbican-api/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.282444 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b7d9bc6cb-z65xd_e2fb17bd-c741-47d7-8f3b-c9057c58105c/barbican-api-log/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.413266 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d8dfccd44-t4jbz_d7afb7d3-8d9d-475f-9ef5-75c5125ec374/barbican-keystone-listener/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.644119 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5d95764d5c-br2pn_15f72fda-eacd-42a1-8be8-92e28ed31924/barbican-worker/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.808870 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d8dfccd44-t4jbz_d7afb7d3-8d9d-475f-9ef5-75c5125ec374/barbican-keystone-listener-log/0.log" Oct 13 15:54:20 crc kubenswrapper[4797]: I1013 15:54:20.810244 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d95764d5c-br2pn_15f72fda-eacd-42a1-8be8-92e28ed31924/barbican-worker-log/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.005151 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-znhqv_6ddd3dd4-0ef9-495e-9ffa-8358884cb552/bootstrap-openstack-openstack-cell1/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.069123 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_713cca7f-3dd9-4fde-8672-c566a1acd8ce/ceilometer-central-agent/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.190745 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_713cca7f-3dd9-4fde-8672-c566a1acd8ce/ceilometer-notification-agent/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.245141 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_713cca7f-3dd9-4fde-8672-c566a1acd8ce/sg-core/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.317234 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_713cca7f-3dd9-4fde-8672-c566a1acd8ce/proxy-httpd/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.458251 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-kpsp9_8d868a76-1350-41ea-875b-b4841f483390/ceph-client-openstack-openstack-cell1/0.log" Oct 13 
15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.677334 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3bace6c-b376-4b6d-8759-5c90d5d5b02b/cinder-api-log/0.log" Oct 13 15:54:21 crc kubenswrapper[4797]: I1013 15:54:21.795820 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3bace6c-b376-4b6d-8759-5c90d5d5b02b/cinder-api/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.161531 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2579cd51-2c4b-4a29-993e-d38dfffead2b/cinder-backup/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.272962 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2579cd51-2c4b-4a29-993e-d38dfffead2b/probe/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.383908 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2079f065-f421-4b28-8023-926aa90e9f63/cinder-scheduler/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.507545 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2079f065-f421-4b28-8023-926aa90e9f63/probe/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.658405 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_dc355640-e2b0-4d27-8135-0ab3599cba98/cinder-volume/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.732795 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_dc355640-e2b0-4d27-8135-0ab3599cba98/probe/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 15:54:22.845295 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-5k77q_edbca830-a799-43d2-a312-1aa256aabed6/configure-network-openstack-openstack-cell1/0.log" Oct 13 15:54:22 crc kubenswrapper[4797]: I1013 
15:54:22.943006 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-fx9pj_dc3cd5ee-3e78-4d2f-97dd-6901058f0f49/configure-os-openstack-openstack-cell1/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.041722 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f55445bd9-6dr9d_9f72e1e8-e5a1-4c86-9ef7-386e430a851c/init/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.217182 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f55445bd9-6dr9d_9f72e1e8-e5a1-4c86-9ef7-386e430a851c/init/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.271729 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-6wfff_e800517f-8039-464d-8137-cc928b84cc79/download-cache-openstack-openstack-cell1/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.274413 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f55445bd9-6dr9d_9f72e1e8-e5a1-4c86-9ef7-386e430a851c/dnsmasq-dns/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.454529 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7332de29-571b-4dfd-8049-6b3c749109cc/glance-httpd/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.461198 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7332de29-571b-4dfd-8049-6b3c749109cc/glance-log/0.log" Oct 13 15:54:23 crc kubenswrapper[4797]: I1013 15:54:23.633023 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_82fccc24-58d1-4521-9a93-be26f53cb8c3/glance-httpd/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.307505 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_82fccc24-58d1-4521-9a93-be26f53cb8c3/glance-log/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.424609 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6794f4b959-8wbgr_b291a2fb-8b0f-4ece-8665-fa382a5c51e4/heat-api/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.599620 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6c67d84d9-t7s9t_456c5f14-d211-4869-914d-73d2fd6efa69/heat-engine/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.664263 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6b8568fd4f-wdjdv_19924c78-0046-4b6a-91be-546357d8b190/heat-cfnapi/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.906242 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-55ccf7bdd9-dbjvm_59c6a50b-c86b-4e7e-98c8-067c2aaf9777/horizon/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.927325 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-55ccf7bdd9-dbjvm_59c6a50b-c86b-4e7e-98c8-067c2aaf9777/horizon-log/0.log" Oct 13 15:54:24 crc kubenswrapper[4797]: I1013 15:54:24.964856 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-qvlrc_a9088bd4-a2fd-4a3e-a3ff-b43cf035de24/install-certs-openstack-openstack-cell1/0.log" Oct 13 15:54:25 crc kubenswrapper[4797]: I1013 15:54:25.200928 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-r6xm4_fb2cf8ea-12be-4bf7-9b6e-fcded0175c91/install-os-openstack-openstack-cell1/0.log" Oct 13 15:54:25 crc kubenswrapper[4797]: I1013 15:54:25.463888 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29339461-hwsdn_7cc54a2c-eb42-4b57-b4c4-1592a7ce99d2/keystone-cron/0.log" Oct 13 15:54:25 crc kubenswrapper[4797]: 
I1013 15:54:25.678560 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69d9d545f8-4q5z4_e8508459-dcad-4083-884e-5f763b630be0/keystone-api/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.226932 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_637619fb-02b4-4561-9765-0d75ef8ab480/kube-state-metrics/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.279169 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-wkdb2_9da6c975-70b2-4296-9070-7608770a0446/libvirt-openstack-openstack-cell1/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.483578 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_58623ce0-7ada-4657-954a-d91efd56bb3e/manila-api-log/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.523289 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_58623ce0-7ada-4657-954a-d91efd56bb3e/manila-api/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.654297 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_cb39c204-29ae-4483-8d78-810d907515fe/manila-scheduler/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.668647 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_cb39c204-29ae-4483-8d78-810d907515fe/probe/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.784503 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_945cdf25-57ce-44f2-89ae-2a17e21c485f/manila-share/0.log" Oct 13 15:54:26 crc kubenswrapper[4797]: I1013 15:54:26.882193 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_945cdf25-57ce-44f2-89ae-2a17e21c485f/probe/0.log" Oct 13 15:54:27 crc kubenswrapper[4797]: I1013 15:54:27.350948 4797 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_neutron-7f5b654777-hpl7r_14fda70e-da94-432b-8c32-8f290ca1ab52/neutron-httpd/0.log" Oct 13 15:54:27 crc kubenswrapper[4797]: I1013 15:54:27.531903 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-sfvnf_17115edf-f950-40b3-9a3b-3948815da323/neutron-dhcp-openstack-openstack-cell1/0.log" Oct 13 15:54:27 crc kubenswrapper[4797]: I1013 15:54:27.664695 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f5b654777-hpl7r_14fda70e-da94-432b-8c32-8f290ca1ab52/neutron-api/0.log" Oct 13 15:54:27 crc kubenswrapper[4797]: I1013 15:54:27.807417 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-mgjkg_bcae55b0-a4a1-40c4-85a0-de5a29520295/neutron-metadata-openstack-openstack-cell1/0.log" Oct 13 15:54:27 crc kubenswrapper[4797]: I1013 15:54:27.961674 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-dkdhk_373fe301-acc1-486b-a109-62e739dec048/neutron-sriov-openstack-openstack-cell1/0.log" Oct 13 15:54:28 crc kubenswrapper[4797]: I1013 15:54:28.295787 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0762bcb1-f8cd-4a9d-8691-1f6e32602199/nova-api-api/0.log" Oct 13 15:54:28 crc kubenswrapper[4797]: I1013 15:54:28.436105 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0762bcb1-f8cd-4a9d-8691-1f6e32602199/nova-api-log/0.log" Oct 13 15:54:28 crc kubenswrapper[4797]: I1013 15:54:28.542310 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_cb9b1886-01a9-49a3-a525-e2cebb3c8c85/nova-cell0-conductor-conductor/0.log" Oct 13 15:54:28 crc kubenswrapper[4797]: I1013 15:54:28.774620 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_7295c003-d21f-4137-96b6-0ae19de3d1be/nova-cell1-conductor-conductor/0.log" Oct 13 15:54:28 crc kubenswrapper[4797]: I1013 15:54:28.898346 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a4d665a2-0bed-48b1-bb0c-0688c897a898/nova-cell1-novncproxy-novncproxy/0.log" Oct 13 15:54:29 crc kubenswrapper[4797]: I1013 15:54:29.134978 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellsnn9x_2e8a47d5-adb1-4909-9731-680948fa0320/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Oct 13 15:54:29 crc kubenswrapper[4797]: I1013 15:54:29.371327 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-b7rd2_26ec5d7d-17f0-4271-ab0b-30af4063a2c1/nova-cell1-openstack-openstack-cell1/0.log" Oct 13 15:54:29 crc kubenswrapper[4797]: I1013 15:54:29.452386 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c1c896d3-bdc3-4adb-b712-43be751a5fd8/nova-metadata-log/0.log" Oct 13 15:54:29 crc kubenswrapper[4797]: I1013 15:54:29.600622 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c1c896d3-bdc3-4adb-b712-43be751a5fd8/nova-metadata-metadata/0.log" Oct 13 15:54:29 crc kubenswrapper[4797]: I1013 15:54:29.788031 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_43eea067-06e3-4bc2-ad55-7e7157dbbb99/nova-scheduler-scheduler/0.log" Oct 13 15:54:29 crc kubenswrapper[4797]: I1013 15:54:29.834220 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8eeb3b4f-e45c-44bb-874f-2445e71ea23a/mysql-bootstrap/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.009200 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_8eeb3b4f-e45c-44bb-874f-2445e71ea23a/galera/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.016464 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8eeb3b4f-e45c-44bb-874f-2445e71ea23a/mysql-bootstrap/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.235936 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:54:30 crc kubenswrapper[4797]: E1013 15:54:30.236202 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.259033 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5077271-1a56-4d9c-82e5-c8ecc23f5ef8/mysql-bootstrap/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.425551 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5077271-1a56-4d9c-82e5-c8ecc23f5ef8/mysql-bootstrap/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.433715 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5077271-1a56-4d9c-82e5-c8ecc23f5ef8/galera/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.640667 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7b6a4ded-dd7b-45bc-94c3-379d1288d07f/openstackclient/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.689165 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_3f6dd28e-459b-4ef1-8211-a82eba48d1bd/openstack-network-exporter/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.818602 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f6dd28e-459b-4ef1-8211-a82eba48d1bd/ovn-northd/0.log" Oct 13 15:54:30 crc kubenswrapper[4797]: I1013 15:54:30.942088 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-57ps4_366a3bab-eb21-41c6-b4e3-52f6e34cc08c/ovn-openstack-openstack-cell1/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.146389 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2f4d36fc-f414-419a-88fb-8897d4029861/ovsdbserver-nb/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.150564 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2f4d36fc-f414-419a-88fb-8897d4029861/openstack-network-exporter/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.354546 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f792911e-0022-42df-9a19-9912fde49848/ovsdbserver-nb/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.370095 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f792911e-0022-42df-9a19-9912fde49848/openstack-network-exporter/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.558932 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_1cbefe95-eada-4df1-90ce-a7350636f4fb/openstack-network-exporter/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.601754 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_1cbefe95-eada-4df1-90ce-a7350636f4fb/ovsdbserver-nb/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.795365 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01/openstack-network-exporter/0.log" Oct 13 15:54:31 crc kubenswrapper[4797]: I1013 15:54:31.822715 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_31b1a65c-acf8-4ab5-8e13-fbfa0bd51d01/ovsdbserver-sb/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.012006 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fb1869c6-0b2c-4fad-9681-94b1b427085a/openstack-network-exporter/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.042228 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fb1869c6-0b2c-4fad-9681-94b1b427085a/ovsdbserver-sb/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.203019 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_2ab030ed-5bf4-480d-bdef-c2e145080cfd/openstack-network-exporter/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.234424 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_2ab030ed-5bf4-480d-bdef-c2e145080cfd/ovsdbserver-sb/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.667050 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f8d4c4db4-4chzm_fb9dbb3f-deb5-48f7-815c-9d2166039ea9/placement-api/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.710291 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f8d4c4db4-4chzm_fb9dbb3f-deb5-48f7-815c-9d2166039ea9/placement-log/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.834242 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b8191ae4-3c4c-4059-8b22-ca8d1967e6f8/memcached/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.881101 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cwpmrr_9ceb0bb2-f070-48f0-a768-f0a75b81c937/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 13 15:54:32 crc kubenswrapper[4797]: I1013 15:54:32.891344 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d/init-config-reloader/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.022554 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d/init-config-reloader/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.060786 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d/prometheus/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.235711 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d/config-reloader/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.245487 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5b21f03c-f0be-45b6-b3d2-eaa44eb95e3d/thanos-sidecar/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.371276 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_24ee7ce6-783b-433c-a9d0-1a3b81edd035/setup-container/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.536721 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_24ee7ce6-783b-433c-a9d0-1a3b81edd035/setup-container/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.550096 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_24ee7ce6-783b-433c-a9d0-1a3b81edd035/rabbitmq/0.log" Oct 13 
15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.612895 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4404e85-5dbc-4ac0-af0a-75886d50bb73/setup-container/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.774406 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4404e85-5dbc-4ac0-af0a-75886d50bb73/setup-container/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.836270 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4404e85-5dbc-4ac0-af0a-75886d50bb73/rabbitmq/0.log" Oct 13 15:54:33 crc kubenswrapper[4797]: I1013 15:54:33.845537 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-xl2wv_24bc34c6-31c1-400b-8ea5-f857626afde4/reboot-os-openstack-openstack-cell1/0.log" Oct 13 15:54:34 crc kubenswrapper[4797]: I1013 15:54:34.684912 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-6c9b5_0e92099f-7960-419f-adcc-d73622e1b2f1/ssh-known-hosts-openstack/0.log" Oct 13 15:54:34 crc kubenswrapper[4797]: I1013 15:54:34.729607 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-29bxv_c215c11f-7d9d-482b-8ce9-9b7aeeb6ced0/run-os-openstack-openstack-cell1/0.log" Oct 13 15:54:34 crc kubenswrapper[4797]: I1013 15:54:34.888486 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-8jxlh_af167228-e8f8-4bad-82cb-e1d853a9b317/telemetry-openstack-openstack-cell1/0.log" Oct 13 15:54:35 crc kubenswrapper[4797]: I1013 15:54:35.018538 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b91301d7-01b4-48ec-b44e-12d408a58e1c/tempest-tests-tempest-tests-runner/0.log" Oct 13 15:54:35 crc kubenswrapper[4797]: I1013 15:54:35.163692 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_45bc585b-ca78-4517-aba6-d855a810d80b/test-operator-logs-container/0.log" Oct 13 15:54:35 crc kubenswrapper[4797]: I1013 15:54:35.226446 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-sfvqb_ab353644-5bc4-4dea-a881-0c4009efb270/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 13 15:54:35 crc kubenswrapper[4797]: I1013 15:54:35.343674 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-l9gmr_94c7ef1d-6bc2-4a04-94dc-8ab98fff0fce/validate-network-openstack-openstack-cell1/0.log" Oct 13 15:54:45 crc kubenswrapper[4797]: I1013 15:54:45.236370 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:54:45 crc kubenswrapper[4797]: E1013 15:54:45.237109 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:54:59 crc kubenswrapper[4797]: I1013 15:54:59.238474 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:54:59 crc kubenswrapper[4797]: E1013 15:54:59.239498 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:55:13 crc kubenswrapper[4797]: I1013 15:55:13.254135 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:55:13 crc kubenswrapper[4797]: E1013 15:55:13.255047 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:55:28 crc kubenswrapper[4797]: I1013 15:55:28.235844 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:55:28 crc kubenswrapper[4797]: E1013 15:55:28.236649 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:55:42 crc kubenswrapper[4797]: I1013 15:55:42.235998 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:55:42 crc kubenswrapper[4797]: E1013 15:55:42.237032 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 15:55:55 crc kubenswrapper[4797]: I1013 15:55:55.236654 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:55:55 crc kubenswrapper[4797]: I1013 15:55:55.805475 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"b066f22aad24fb6aeea7702e6e3d1bfdd811246059ddf3b0194ee6a04bd047e1"} Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.395083 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/util/0.log" Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.595558 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/pull/0.log" Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.621962 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/pull/0.log" Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.657237 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/util/0.log" Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.809284 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/util/0.log" Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.868190 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/pull/0.log" Oct 13 15:56:20 crc kubenswrapper[4797]: I1013 15:56:20.915524 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036afmvwg_24115c99-7b3e-44b3-b517-45c8118d5645/extract/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.005666 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-4xbcc_2ce34775-733e-42d7-a688-4c12edad7614/kube-rbac-proxy/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.143039 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-jtc6c_cc7558e9-8bd0-4dda-9792-49855202f2bf/kube-rbac-proxy/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.153747 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-4xbcc_2ce34775-733e-42d7-a688-4c12edad7614/manager/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.305836 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-jtc6c_cc7558e9-8bd0-4dda-9792-49855202f2bf/manager/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.393101 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-rdczf_f07f9d14-0fd4-4702-877a-8e0097a23791/manager/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: 
I1013 15:56:21.425789 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-rdczf_f07f9d14-0fd4-4702-877a-8e0097a23791/kube-rbac-proxy/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.550030 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-vfpgt_13487799-53f7-4c74-9f16-770bf4dbace5/kube-rbac-proxy/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.690057 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-vfpgt_13487799-53f7-4c74-9f16-770bf4dbace5/manager/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.758589 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-79946_424af6e9-8a27-446e-b11e-7a84032f476e/kube-rbac-proxy/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.821478 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-79946_424af6e9-8a27-446e-b11e-7a84032f476e/manager/0.log" Oct 13 15:56:21 crc kubenswrapper[4797]: I1013 15:56:21.924256 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-sdtkf_161dd833-44d7-4dac-9ea7-d2c059e2f593/kube-rbac-proxy/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.027944 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-sdtkf_161dd833-44d7-4dac-9ea7-d2c059e2f593/manager/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.114320 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-ntnr8_7493abc9-384b-43e1-aa00-1c6ae0ddf144/kube-rbac-proxy/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.297644 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-l8d5s_f092abb5-bd31-41f3-bedb-1e9523f17044/kube-rbac-proxy/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.402571 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-l8d5s_f092abb5-bd31-41f3-bedb-1e9523f17044/manager/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.420983 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-ntnr8_7493abc9-384b-43e1-aa00-1c6ae0ddf144/manager/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.521230 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-znjhj_b9adb7ab-599b-4ac1-b7d4-d22efc7fda95/kube-rbac-proxy/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.727087 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-znjhj_b9adb7ab-599b-4ac1-b7d4-d22efc7fda95/manager/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.742213 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-m9cpw_5132d95b-625f-4eb5-9a09-47e695441c86/kube-rbac-proxy/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.769359 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-m9cpw_5132d95b-625f-4eb5-9a09-47e695441c86/manager/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.943581 
4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-bld7w_ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3/kube-rbac-proxy/0.log" Oct 13 15:56:22 crc kubenswrapper[4797]: I1013 15:56:22.966502 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-bld7w_ed01e7fd-31f4-47d4-9b83-3544f3e1f5d3/manager/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.137358 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-pqvwj_8c61f396-891f-4c58-ba21-e53d8e357358/kube-rbac-proxy/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.245176 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-pqvwj_8c61f396-891f-4c58-ba21-e53d8e357358/manager/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.423707 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-fcgbt_990f0215-8f03-4fb7-ae16-0d89130a5ba3/kube-rbac-proxy/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.539242 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-dh4qb_21276615-f6a5-4b36-b65a-4b45a1f4b7e4/kube-rbac-proxy/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.731117 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-dh4qb_21276615-f6a5-4b36-b65a-4b45a1f4b7e4/manager/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.748113 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-fcgbt_990f0215-8f03-4fb7-ae16-0d89130a5ba3/manager/0.log" Oct 13 15:56:23 crc 
kubenswrapper[4797]: I1013 15:56:23.847985 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55b7d448487kdnh_08de49ed-c17f-42fc-8bb1-2cb6684984f1/kube-rbac-proxy/0.log" Oct 13 15:56:23 crc kubenswrapper[4797]: I1013 15:56:23.957513 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55b7d448487kdnh_08de49ed-c17f-42fc-8bb1-2cb6684984f1/manager/0.log" Oct 13 15:56:24 crc kubenswrapper[4797]: I1013 15:56:24.014937 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7fb8c88b76-4rdzv_1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4/kube-rbac-proxy/0.log" Oct 13 15:56:24 crc kubenswrapper[4797]: I1013 15:56:24.241394 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-64895cd698-r4rcj_13b7c0b2-c215-4740-a6af-b67ce7ab0dd3/kube-rbac-proxy/0.log" Oct 13 15:56:24 crc kubenswrapper[4797]: I1013 15:56:24.502936 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-64895cd698-r4rcj_13b7c0b2-c215-4740-a6af-b67ce7ab0dd3/operator/0.log" Oct 13 15:56:24 crc kubenswrapper[4797]: I1013 15:56:24.704797 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p2rmj_bb202dd1-eb2d-4c54-aa6e-160a65f46f21/registry-server/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.054505 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-mqjcg_f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf/kube-rbac-proxy/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.338760 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-mqjcg_f7afdb2f-edf9-4dfe-a7d1-43b6a5ec8dcf/manager/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.472488 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-766z6_f0e7ab2d-9124-44fe-aa40-abe8b405d449/manager/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.498683 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-766z6_f0e7ab2d-9124-44fe-aa40-abe8b405d449/kube-rbac-proxy/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.560427 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-w9st4_c2797d9e-1ac0-4ac4-8d0e-8c9061623f50/operator/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.685986 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-kgmrr_ad662e1a-7f25-414a-9358-cb1994840925/kube-rbac-proxy/0.log" Oct 13 15:56:25 crc kubenswrapper[4797]: I1013 15:56:25.846396 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-kgmrr_ad662e1a-7f25-414a-9358-cb1994840925/manager/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.115531 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-v826q_549ef07f-ef05-4c9a-8700-a19008de4afe/kube-rbac-proxy/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.239650 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-vrt4f_535566f0-4f81-4362-a4a0-18c9b2dedd8d/kube-rbac-proxy/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.304440 4797 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-v826q_549ef07f-ef05-4c9a-8700-a19008de4afe/manager/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.391851 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7fb8c88b76-4rdzv_1d1f4280-f5b1-41f9-8eeb-3d2d9cac65e4/manager/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.396203 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-vrt4f_535566f0-4f81-4362-a4a0-18c9b2dedd8d/manager/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.821306 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-qsnt6_4cc72187-24a1-4ec5-907d-d5295814e428/kube-rbac-proxy/0.log" Oct 13 15:56:26 crc kubenswrapper[4797]: I1013 15:56:26.857282 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-qsnt6_4cc72187-24a1-4ec5-907d-d5295814e428/manager/0.log" Oct 13 15:56:44 crc kubenswrapper[4797]: I1013 15:56:44.667044 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fb6d2_73bc6ad6-8024-4d5b-a0b6-995e29f987af/control-plane-machine-set-operator/0.log" Oct 13 15:56:45 crc kubenswrapper[4797]: I1013 15:56:45.367058 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vw685_3a9a3c57-03b2-4adc-82a1-3aba68c83636/machine-api-operator/0.log" Oct 13 15:56:45 crc kubenswrapper[4797]: I1013 15:56:45.371327 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vw685_3a9a3c57-03b2-4adc-82a1-3aba68c83636/kube-rbac-proxy/0.log" Oct 13 15:56:57 crc 
kubenswrapper[4797]: I1013 15:56:57.811820 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-p25dd_5903e7e4-5c29-4320-9c08-cccaab0cd30f/cert-manager-controller/0.log" Oct 13 15:56:57 crc kubenswrapper[4797]: I1013 15:56:57.929709 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-f2mp5_99855fda-83c3-48ad-bdea-d1f21a0407fd/cert-manager-cainjector/0.log" Oct 13 15:56:58 crc kubenswrapper[4797]: I1013 15:56:58.038731 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-h8b9w_ec3a8923-32a5-467e-949b-0450963f0afb/cert-manager-webhook/0.log" Oct 13 15:57:10 crc kubenswrapper[4797]: I1013 15:57:10.766846 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-vcz62_b6e43341-62a8-462b-80c3-86ab7db3a7f6/nmstate-console-plugin/0.log" Oct 13 15:57:10 crc kubenswrapper[4797]: I1013 15:57:10.989175 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4gzdz_567b9831-64e4-48ec-bdae-acc961720179/nmstate-handler/0.log" Oct 13 15:57:11 crc kubenswrapper[4797]: I1013 15:57:11.011174 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dgck6_4b7f9fc5-d10a-4c60-a58c-f3636f159812/kube-rbac-proxy/0.log" Oct 13 15:57:11 crc kubenswrapper[4797]: I1013 15:57:11.131200 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dgck6_4b7f9fc5-d10a-4c60-a58c-f3636f159812/nmstate-metrics/0.log" Oct 13 15:57:11 crc kubenswrapper[4797]: I1013 15:57:11.205904 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-rzsld_b08560e7-e858-4ae9-9674-ac7c89a2d5d4/nmstate-operator/0.log" Oct 13 15:57:11 crc kubenswrapper[4797]: I1013 15:57:11.373319 4797 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-6g2mr_2b289246-79ef-45ae-afdd-ab40537151b5/nmstate-webhook/0.log" Oct 13 15:57:26 crc kubenswrapper[4797]: I1013 15:57:26.871306 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-v7qt6_3fad6639-3207-47d1-9c1c-f28bed56e219/kube-rbac-proxy/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.106233 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zxbq8_5a9c42d5-f403-43ce-80fd-e38f20646272/frr-k8s-webhook-server/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.115077 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-frr-files/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.362411 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-v7qt6_3fad6639-3207-47d1-9c1c-f28bed56e219/controller/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.385438 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-frr-files/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.385583 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-reloader/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.421698 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-metrics/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.539593 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-reloader/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.731224 4797 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-reloader/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.737321 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-frr-files/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.739195 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-metrics/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.749602 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-metrics/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.909076 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-frr-files/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.913944 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-metrics/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.914553 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/cp-reloader/0.log" Oct 13 15:57:27 crc kubenswrapper[4797]: I1013 15:57:27.963250 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/controller/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.076998 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/frr-metrics/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.109283 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/kube-rbac-proxy/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.169336 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/kube-rbac-proxy-frr/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.273459 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/reloader/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.382224 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56c6888874-lkjwb_66c7fd1a-54de-49e1-9f09-93bad2b9ed1d/manager/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.642913 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69f86b4dfd-84kr7_ffa5c642-390d-402e-82e5-ec453a7814ee/webhook-server/0.log" Oct 13 15:57:28 crc kubenswrapper[4797]: I1013 15:57:28.763147 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kws7b_c2336e04-93f4-4e2a-a221-21c09083f0ac/kube-rbac-proxy/0.log" Oct 13 15:57:29 crc kubenswrapper[4797]: I1013 15:57:29.686145 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kws7b_c2336e04-93f4-4e2a-a221-21c09083f0ac/speaker/0.log" Oct 13 15:57:31 crc kubenswrapper[4797]: I1013 15:57:31.533508 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xjs8k_44e2efb9-07fa-42db-a605-44970bbe88bc/frr/0.log" Oct 13 15:57:42 crc kubenswrapper[4797]: I1013 15:57:42.831486 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/util/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.008576 4797 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/util/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.039432 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/pull/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.078460 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/pull/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.198713 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/pull/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.199256 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/util/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.228078 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb695jx5p_bde8aa22-3546-40b8-89bb-3415532d55b4/extract/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.378238 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/util/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.520778 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/util/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.539742 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/pull/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.564186 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/pull/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.684905 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/util/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.727594 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/pull/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.737458 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zspc8_76e0cd0f-5fd1-4d5c-9590-c838a57bd5c4/extract/0.log" Oct 13 15:57:43 crc kubenswrapper[4797]: I1013 15:57:43.880368 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/util/0.log" Oct 13 15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.069775 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/util/0.log" Oct 13 
15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.092434 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/pull/0.log" Oct 13 15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.149087 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/pull/0.log" Oct 13 15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.272849 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/pull/0.log" Oct 13 15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.274759 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/util/0.log" Oct 13 15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.280710 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drz84j_db995f55-9b3f-4fc7-a12c-9f37d196794c/extract/0.log" Oct 13 15:57:44 crc kubenswrapper[4797]: I1013 15:57:44.454057 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/extract-utilities/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.178565 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/extract-utilities/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.194022 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/extract-content/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.229922 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/extract-content/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.361898 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/extract-utilities/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.377210 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/extract-content/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.620662 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/extract-utilities/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.801469 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/extract-utilities/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.889849 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/extract-content/0.log" Oct 13 15:57:45 crc kubenswrapper[4797]: I1013 15:57:45.935346 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/extract-content/0.log" Oct 13 15:57:46 crc kubenswrapper[4797]: I1013 15:57:46.175419 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/extract-utilities/0.log" Oct 13 15:57:46 crc kubenswrapper[4797]: I1013 15:57:46.223710 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/extract-content/0.log" Oct 13 15:57:46 crc kubenswrapper[4797]: I1013 15:57:46.427003 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/util/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.046029 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-x7gp7_c1a9748e-b3fa-4c05-98f8-5ed245e08fad/registry-server/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.196381 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/pull/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.239409 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/util/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.243330 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/pull/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.464820 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/util/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.478549 4797 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/extract/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.486261 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rqvz_e70a0257-aaba-41a5-a201-8062761a1adf/pull/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.749174 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-frmvf_aea11d03-92b6-4f03-b4bc-61042afa7406/marketplace-operator/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.792971 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/extract-utilities/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.794725 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-phmt9_1fd8962b-9e85-452e-b0db-8d1f12109329/registry-server/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.971250 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/extract-utilities/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.991603 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/extract-content/0.log" Oct 13 15:57:47 crc kubenswrapper[4797]: I1013 15:57:47.993828 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/extract-content/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.179358 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/extract-content/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.186156 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/extract-utilities/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.243655 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/extract-utilities/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.475616 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/extract-utilities/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.476758 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9wvg_cb509468-0e31-4f31-8aea-3a1c9111574d/registry-server/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.527535 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/extract-content/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.572883 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/extract-content/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.697190 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/extract-utilities/0.log" Oct 13 15:57:48 crc kubenswrapper[4797]: I1013 15:57:48.697440 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/extract-content/0.log" Oct 
13 15:57:49 crc kubenswrapper[4797]: I1013 15:57:49.903186 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vgrr4_eb0fc30c-b4bd-41bc-871e-1e85b3f115f2/registry-server/0.log" Oct 13 15:58:00 crc kubenswrapper[4797]: I1013 15:58:00.524032 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-tplx7_c51a5185-9bfb-46c9-95fa-b41b91150fc1/prometheus-operator/0.log" Oct 13 15:58:00 crc kubenswrapper[4797]: I1013 15:58:00.742700 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-84558866d4-4fbtq_57f56c14-5bb3-410a-a578-4814c6ce81a8/prometheus-operator-admission-webhook/0.log" Oct 13 15:58:00 crc kubenswrapper[4797]: I1013 15:58:00.778053 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-84558866d4-w8c2d_2d98b442-31b7-44c4-b551-55af7fb2ff25/prometheus-operator-admission-webhook/0.log" Oct 13 15:58:00 crc kubenswrapper[4797]: I1013 15:58:00.943151 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-rpdw6_4449e3ff-7bc9-44b4-b1e6-932bb69225dd/operator/0.log" Oct 13 15:58:00 crc kubenswrapper[4797]: I1013 15:58:00.981632 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-zc6fr_ece0b1a2-3e6f-461f-bace-aece27efc279/perses-operator/0.log" Oct 13 15:58:12 crc kubenswrapper[4797]: E1013 15:58:12.906870 4797 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.147:35008->38.102.83.147:46853: read tcp 38.102.83.147:35008->38.102.83.147:46853: read: connection reset by peer Oct 13 15:58:18 crc kubenswrapper[4797]: I1013 15:58:18.120275 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:58:18 crc kubenswrapper[4797]: I1013 15:58:18.120870 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:58:48 crc kubenswrapper[4797]: I1013 15:58:48.120473 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:58:48 crc kubenswrapper[4797]: I1013 15:58:48.121005 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.120480 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.122797 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.123049 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.124379 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b066f22aad24fb6aeea7702e6e3d1bfdd811246059ddf3b0194ee6a04bd047e1"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.124536 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://b066f22aad24fb6aeea7702e6e3d1bfdd811246059ddf3b0194ee6a04bd047e1" gracePeriod=600 Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.896501 4797 generic.go:334] "Generic (PLEG): container finished" podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="b066f22aad24fb6aeea7702e6e3d1bfdd811246059ddf3b0194ee6a04bd047e1" exitCode=0 Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.897015 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"b066f22aad24fb6aeea7702e6e3d1bfdd811246059ddf3b0194ee6a04bd047e1"} Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.897052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" 
event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerStarted","Data":"0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398"} Oct 13 15:59:18 crc kubenswrapper[4797]: I1013 15:59:18.897068 4797 scope.go:117] "RemoveContainer" containerID="7f1e83580ed6057b8717484b17b17fa6431ebb611cc0b1233bd7239aa2d8f6f4" Oct 13 15:59:21 crc kubenswrapper[4797]: I1013 15:59:21.411682 4797 scope.go:117] "RemoveContainer" containerID="e1cc6bf297f86f4eb8bc83bf953048585517720b009de660eeb1946380951b09" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.169154 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls"] Oct 13 16:00:00 crc kubenswrapper[4797]: E1013 16:00:00.170613 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceca53c-91aa-4d19-bcff-f03db87675fb" containerName="container-00" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.170632 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceca53c-91aa-4d19-bcff-f03db87675fb" containerName="container-00" Oct 13 16:00:00 crc kubenswrapper[4797]: E1013 16:00:00.170679 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="registry-server" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.170687 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="registry-server" Oct 13 16:00:00 crc kubenswrapper[4797]: E1013 16:00:00.170708 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="extract-utilities" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.170716 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="extract-utilities" Oct 13 16:00:00 crc kubenswrapper[4797]: E1013 16:00:00.170737 4797 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="extract-content" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.170759 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="extract-content" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.171084 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb194e5f-1a8b-4f25-abc4-d481be912fff" containerName="registry-server" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.171113 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ceca53c-91aa-4d19-bcff-f03db87675fb" containerName="container-00" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.172298 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.174911 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.175114 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.206722 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls"] Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.286924 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdc95f86-5039-491a-8fe0-7e7db5483c51-config-volume\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 
16:00:00.287019 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdc95f86-5039-491a-8fe0-7e7db5483c51-secret-volume\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.287187 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmws\" (UniqueName: \"kubernetes.io/projected/fdc95f86-5039-491a-8fe0-7e7db5483c51-kube-api-access-rwmws\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.389219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmws\" (UniqueName: \"kubernetes.io/projected/fdc95f86-5039-491a-8fe0-7e7db5483c51-kube-api-access-rwmws\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.389404 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdc95f86-5039-491a-8fe0-7e7db5483c51-config-volume\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.389425 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdc95f86-5039-491a-8fe0-7e7db5483c51-secret-volume\") pod \"collect-profiles-29339520-r9xls\" (UID: 
\"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.390484 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdc95f86-5039-491a-8fe0-7e7db5483c51-config-volume\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.395259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdc95f86-5039-491a-8fe0-7e7db5483c51-secret-volume\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.408466 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwmws\" (UniqueName: \"kubernetes.io/projected/fdc95f86-5039-491a-8fe0-7e7db5483c51-kube-api-access-rwmws\") pod \"collect-profiles-29339520-r9xls\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.506495 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:00 crc kubenswrapper[4797]: I1013 16:00:00.969635 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls"] Oct 13 16:00:01 crc kubenswrapper[4797]: I1013 16:00:01.353454 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" event={"ID":"fdc95f86-5039-491a-8fe0-7e7db5483c51","Type":"ContainerStarted","Data":"97592fcc45172002559471b64f8b8a26e636281ee55b09201d7dde7fdd5aee6e"} Oct 13 16:00:01 crc kubenswrapper[4797]: I1013 16:00:01.353723 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" event={"ID":"fdc95f86-5039-491a-8fe0-7e7db5483c51","Type":"ContainerStarted","Data":"3f191aa16c9adcb54a8dc655c3211ce77ca66ba64af2eb1d5ceeeecacbfafaa7"} Oct 13 16:00:01 crc kubenswrapper[4797]: I1013 16:00:01.375499 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" podStartSLOduration=1.375476651 podStartE2EDuration="1.375476651s" podCreationTimestamp="2025-10-13 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 16:00:01.37341062 +0000 UTC m=+10378.906960876" watchObservedRunningTime="2025-10-13 16:00:01.375476651 +0000 UTC m=+10378.909026927" Oct 13 16:00:02 crc kubenswrapper[4797]: I1013 16:00:02.365950 4797 generic.go:334] "Generic (PLEG): container finished" podID="fdc95f86-5039-491a-8fe0-7e7db5483c51" containerID="97592fcc45172002559471b64f8b8a26e636281ee55b09201d7dde7fdd5aee6e" exitCode=0 Oct 13 16:00:02 crc kubenswrapper[4797]: I1013 16:00:02.365999 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" event={"ID":"fdc95f86-5039-491a-8fe0-7e7db5483c51","Type":"ContainerDied","Data":"97592fcc45172002559471b64f8b8a26e636281ee55b09201d7dde7fdd5aee6e"} Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.377160 4797 generic.go:334] "Generic (PLEG): container finished" podID="1e1563cf-9081-41da-b894-f07f4ff18604" containerID="3e2767edb4ccc39ed30beabb8ca5bd06ab049fb29447f0b4099941a5cf68808b" exitCode=0 Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.377264 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" event={"ID":"1e1563cf-9081-41da-b894-f07f4ff18604","Type":"ContainerDied","Data":"3e2767edb4ccc39ed30beabb8ca5bd06ab049fb29447f0b4099941a5cf68808b"} Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.379010 4797 scope.go:117] "RemoveContainer" containerID="3e2767edb4ccc39ed30beabb8ca5bd06ab049fb29447f0b4099941a5cf68808b" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.760423 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.878780 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdc95f86-5039-491a-8fe0-7e7db5483c51-config-volume\") pod \"fdc95f86-5039-491a-8fe0-7e7db5483c51\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.878956 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwmws\" (UniqueName: \"kubernetes.io/projected/fdc95f86-5039-491a-8fe0-7e7db5483c51-kube-api-access-rwmws\") pod \"fdc95f86-5039-491a-8fe0-7e7db5483c51\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.879061 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdc95f86-5039-491a-8fe0-7e7db5483c51-secret-volume\") pod \"fdc95f86-5039-491a-8fe0-7e7db5483c51\" (UID: \"fdc95f86-5039-491a-8fe0-7e7db5483c51\") " Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.880250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc95f86-5039-491a-8fe0-7e7db5483c51-config-volume" (OuterVolumeSpecName: "config-volume") pod "fdc95f86-5039-491a-8fe0-7e7db5483c51" (UID: "fdc95f86-5039-491a-8fe0-7e7db5483c51"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.888072 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdc95f86-5039-491a-8fe0-7e7db5483c51-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fdc95f86-5039-491a-8fe0-7e7db5483c51" (UID: "fdc95f86-5039-491a-8fe0-7e7db5483c51"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.888579 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc95f86-5039-491a-8fe0-7e7db5483c51-kube-api-access-rwmws" (OuterVolumeSpecName: "kube-api-access-rwmws") pod "fdc95f86-5039-491a-8fe0-7e7db5483c51" (UID: "fdc95f86-5039-491a-8fe0-7e7db5483c51"). InnerVolumeSpecName "kube-api-access-rwmws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.953746 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvrb6_must-gather-5rh4w_1e1563cf-9081-41da-b894-f07f4ff18604/gather/0.log" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.981261 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwmws\" (UniqueName: \"kubernetes.io/projected/fdc95f86-5039-491a-8fe0-7e7db5483c51-kube-api-access-rwmws\") on node \"crc\" DevicePath \"\"" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.981310 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdc95f86-5039-491a-8fe0-7e7db5483c51-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 16:00:03 crc kubenswrapper[4797]: I1013 16:00:03.981320 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdc95f86-5039-491a-8fe0-7e7db5483c51-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 16:00:04 crc kubenswrapper[4797]: I1013 16:00:04.392366 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" event={"ID":"fdc95f86-5039-491a-8fe0-7e7db5483c51","Type":"ContainerDied","Data":"3f191aa16c9adcb54a8dc655c3211ce77ca66ba64af2eb1d5ceeeecacbfafaa7"} Oct 13 16:00:04 crc kubenswrapper[4797]: I1013 16:00:04.392423 4797 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="3f191aa16c9adcb54a8dc655c3211ce77ca66ba64af2eb1d5ceeeecacbfafaa7" Oct 13 16:00:04 crc kubenswrapper[4797]: I1013 16:00:04.392479 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339520-r9xls" Oct 13 16:00:04 crc kubenswrapper[4797]: I1013 16:00:04.490833 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst"] Oct 13 16:00:04 crc kubenswrapper[4797]: I1013 16:00:04.502716 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339475-x9hst"] Oct 13 16:00:05 crc kubenswrapper[4797]: I1013 16:00:05.252418 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ac0fc6-ca84-4bfd-a951-910ec25014dd" path="/var/lib/kubelet/pods/54ac0fc6-ca84-4bfd-a951-910ec25014dd/volumes" Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.315013 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvrb6/must-gather-5rh4w"] Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.315833 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="copy" containerID="cri-o://c27e25ebe88fca8d591eda56eddf8511644e7b9446cb763294ef62f3f66e0cde" gracePeriod=2 Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.329174 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvrb6/must-gather-5rh4w"] Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.496251 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvrb6_must-gather-5rh4w_1e1563cf-9081-41da-b894-f07f4ff18604/copy/0.log" Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.499233 4797 generic.go:334] "Generic 
(PLEG): container finished" podID="1e1563cf-9081-41da-b894-f07f4ff18604" containerID="c27e25ebe88fca8d591eda56eddf8511644e7b9446cb763294ef62f3f66e0cde" exitCode=143 Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.897408 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvrb6_must-gather-5rh4w_1e1563cf-9081-41da-b894-f07f4ff18604/copy/0.log" Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.898049 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.913992 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhvdq\" (UniqueName: \"kubernetes.io/projected/1e1563cf-9081-41da-b894-f07f4ff18604-kube-api-access-mhvdq\") pod \"1e1563cf-9081-41da-b894-f07f4ff18604\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.914203 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e1563cf-9081-41da-b894-f07f4ff18604-must-gather-output\") pod \"1e1563cf-9081-41da-b894-f07f4ff18604\" (UID: \"1e1563cf-9081-41da-b894-f07f4ff18604\") " Oct 13 16:00:13 crc kubenswrapper[4797]: I1013 16:00:13.919783 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1563cf-9081-41da-b894-f07f4ff18604-kube-api-access-mhvdq" (OuterVolumeSpecName: "kube-api-access-mhvdq") pod "1e1563cf-9081-41da-b894-f07f4ff18604" (UID: "1e1563cf-9081-41da-b894-f07f4ff18604"). InnerVolumeSpecName "kube-api-access-mhvdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.017342 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhvdq\" (UniqueName: \"kubernetes.io/projected/1e1563cf-9081-41da-b894-f07f4ff18604-kube-api-access-mhvdq\") on node \"crc\" DevicePath \"\"" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.099170 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e1563cf-9081-41da-b894-f07f4ff18604-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1e1563cf-9081-41da-b894-f07f4ff18604" (UID: "1e1563cf-9081-41da-b894-f07f4ff18604"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.118840 4797 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e1563cf-9081-41da-b894-f07f4ff18604-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.511344 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvrb6_must-gather-5rh4w_1e1563cf-9081-41da-b894-f07f4ff18604/copy/0.log" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.512992 4797 scope.go:117] "RemoveContainer" containerID="c27e25ebe88fca8d591eda56eddf8511644e7b9446cb763294ef62f3f66e0cde" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.513006 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvrb6/must-gather-5rh4w" Oct 13 16:00:14 crc kubenswrapper[4797]: I1013 16:00:14.532927 4797 scope.go:117] "RemoveContainer" containerID="3e2767edb4ccc39ed30beabb8ca5bd06ab049fb29447f0b4099941a5cf68808b" Oct 13 16:00:15 crc kubenswrapper[4797]: I1013 16:00:15.248913 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" path="/var/lib/kubelet/pods/1e1563cf-9081-41da-b894-f07f4ff18604/volumes" Oct 13 16:00:21 crc kubenswrapper[4797]: I1013 16:00:21.470159 4797 scope.go:117] "RemoveContainer" containerID="b3bf19abd98838c8b3660aef7ef0a4d46bf279b21ed9ce1a237fd5c58c1f2ed1" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.155296 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29339521-qf69d"] Oct 13 16:01:00 crc kubenswrapper[4797]: E1013 16:01:00.156331 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="copy" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.156348 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="copy" Oct 13 16:01:00 crc kubenswrapper[4797]: E1013 16:01:00.156365 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc95f86-5039-491a-8fe0-7e7db5483c51" containerName="collect-profiles" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.156371 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc95f86-5039-491a-8fe0-7e7db5483c51" containerName="collect-profiles" Oct 13 16:01:00 crc kubenswrapper[4797]: E1013 16:01:00.156383 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="gather" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.156390 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="gather" 
Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.156624 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc95f86-5039-491a-8fe0-7e7db5483c51" containerName="collect-profiles" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.156637 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="gather" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.156648 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1563cf-9081-41da-b894-f07f4ff18604" containerName="copy" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.157456 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.167334 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339521-qf69d"] Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.347605 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-combined-ca-bundle\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.347688 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-config-data\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.347739 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4hk\" (UniqueName: 
\"kubernetes.io/projected/591d7509-bf22-4524-8c97-1f5f68c647bd-kube-api-access-vd4hk\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.348284 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-fernet-keys\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.449762 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-fernet-keys\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.449856 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-combined-ca-bundle\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.449905 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-config-data\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.449929 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4hk\" (UniqueName: 
\"kubernetes.io/projected/591d7509-bf22-4524-8c97-1f5f68c647bd-kube-api-access-vd4hk\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.456779 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-combined-ca-bundle\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.457696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-config-data\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.457736 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-fernet-keys\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.476864 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4hk\" (UniqueName: \"kubernetes.io/projected/591d7509-bf22-4524-8c97-1f5f68c647bd-kube-api-access-vd4hk\") pod \"keystone-cron-29339521-qf69d\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:00 crc kubenswrapper[4797]: I1013 16:01:00.490760 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:01 crc kubenswrapper[4797]: I1013 16:01:01.010404 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339521-qf69d"] Oct 13 16:01:01 crc kubenswrapper[4797]: W1013 16:01:01.026667 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod591d7509_bf22_4524_8c97_1f5f68c647bd.slice/crio-fad18d86167e7d6498bd2af519e27cab249c4574d42585f503794c885427547e WatchSource:0}: Error finding container fad18d86167e7d6498bd2af519e27cab249c4574d42585f503794c885427547e: Status 404 returned error can't find the container with id fad18d86167e7d6498bd2af519e27cab249c4574d42585f503794c885427547e Oct 13 16:01:02 crc kubenswrapper[4797]: I1013 16:01:02.035962 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339521-qf69d" event={"ID":"591d7509-bf22-4524-8c97-1f5f68c647bd","Type":"ContainerStarted","Data":"f8ae32a22674ff5181fce743b7f0b80857d89229d77c9829a6e9dbf06aeeb30a"} Oct 13 16:01:02 crc kubenswrapper[4797]: I1013 16:01:02.036626 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339521-qf69d" event={"ID":"591d7509-bf22-4524-8c97-1f5f68c647bd","Type":"ContainerStarted","Data":"fad18d86167e7d6498bd2af519e27cab249c4574d42585f503794c885427547e"} Oct 13 16:01:02 crc kubenswrapper[4797]: I1013 16:01:02.055045 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29339521-qf69d" podStartSLOduration=2.055024732 podStartE2EDuration="2.055024732s" podCreationTimestamp="2025-10-13 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 16:01:02.054611802 +0000 UTC m=+10439.588162078" watchObservedRunningTime="2025-10-13 16:01:02.055024732 +0000 UTC m=+10439.588574988" Oct 13 16:01:05 crc 
kubenswrapper[4797]: I1013 16:01:05.066350 4797 generic.go:334] "Generic (PLEG): container finished" podID="591d7509-bf22-4524-8c97-1f5f68c647bd" containerID="f8ae32a22674ff5181fce743b7f0b80857d89229d77c9829a6e9dbf06aeeb30a" exitCode=0 Oct 13 16:01:05 crc kubenswrapper[4797]: I1013 16:01:05.066440 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339521-qf69d" event={"ID":"591d7509-bf22-4524-8c97-1f5f68c647bd","Type":"ContainerDied","Data":"f8ae32a22674ff5181fce743b7f0b80857d89229d77c9829a6e9dbf06aeeb30a"} Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.509239 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.678951 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-combined-ca-bundle\") pod \"591d7509-bf22-4524-8c97-1f5f68c647bd\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.679060 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4hk\" (UniqueName: \"kubernetes.io/projected/591d7509-bf22-4524-8c97-1f5f68c647bd-kube-api-access-vd4hk\") pod \"591d7509-bf22-4524-8c97-1f5f68c647bd\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.679094 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-config-data\") pod \"591d7509-bf22-4524-8c97-1f5f68c647bd\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.679179 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-fernet-keys\") pod \"591d7509-bf22-4524-8c97-1f5f68c647bd\" (UID: \"591d7509-bf22-4524-8c97-1f5f68c647bd\") " Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.685068 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591d7509-bf22-4524-8c97-1f5f68c647bd-kube-api-access-vd4hk" (OuterVolumeSpecName: "kube-api-access-vd4hk") pod "591d7509-bf22-4524-8c97-1f5f68c647bd" (UID: "591d7509-bf22-4524-8c97-1f5f68c647bd"). InnerVolumeSpecName "kube-api-access-vd4hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.691432 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "591d7509-bf22-4524-8c97-1f5f68c647bd" (UID: "591d7509-bf22-4524-8c97-1f5f68c647bd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.722178 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "591d7509-bf22-4524-8c97-1f5f68c647bd" (UID: "591d7509-bf22-4524-8c97-1f5f68c647bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.734549 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-config-data" (OuterVolumeSpecName: "config-data") pod "591d7509-bf22-4524-8c97-1f5f68c647bd" (UID: "591d7509-bf22-4524-8c97-1f5f68c647bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.781432 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.781670 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4hk\" (UniqueName: \"kubernetes.io/projected/591d7509-bf22-4524-8c97-1f5f68c647bd-kube-api-access-vd4hk\") on node \"crc\" DevicePath \"\"" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.781742 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 16:01:06 crc kubenswrapper[4797]: I1013 16:01:06.781818 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/591d7509-bf22-4524-8c97-1f5f68c647bd-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 16:01:07 crc kubenswrapper[4797]: I1013 16:01:07.090676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339521-qf69d" event={"ID":"591d7509-bf22-4524-8c97-1f5f68c647bd","Type":"ContainerDied","Data":"fad18d86167e7d6498bd2af519e27cab249c4574d42585f503794c885427547e"} Oct 13 16:01:07 crc kubenswrapper[4797]: I1013 16:01:07.090720 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad18d86167e7d6498bd2af519e27cab249c4574d42585f503794c885427547e" Oct 13 16:01:07 crc kubenswrapper[4797]: I1013 16:01:07.090795 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339521-qf69d" Oct 13 16:01:18 crc kubenswrapper[4797]: I1013 16:01:18.120578 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 16:01:18 crc kubenswrapper[4797]: I1013 16:01:18.121170 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 16:01:48 crc kubenswrapper[4797]: I1013 16:01:48.120366 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 16:01:48 crc kubenswrapper[4797]: I1013 16:01:48.120995 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.120318 4797 patch_prober.go:28] interesting pod/machine-config-daemon-hrdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.120943 4797 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.121000 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.122011 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398"} pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.122077 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerName="machine-config-daemon" containerID="cri-o://0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398" gracePeriod=600 Oct 13 16:02:18 crc kubenswrapper[4797]: E1013 16:02:18.267123 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.837307 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="345b1c60-ba79-407d-8423-53010f2dfeb0" containerID="0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398" exitCode=0 Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.837383 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" event={"ID":"345b1c60-ba79-407d-8423-53010f2dfeb0","Type":"ContainerDied","Data":"0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398"} Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.837712 4797 scope.go:117] "RemoveContainer" containerID="b066f22aad24fb6aeea7702e6e3d1bfdd811246059ddf3b0194ee6a04bd047e1" Oct 13 16:02:18 crc kubenswrapper[4797]: I1013 16:02:18.838354 4797 scope.go:117] "RemoveContainer" containerID="0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398" Oct 13 16:02:18 crc kubenswrapper[4797]: E1013 16:02:18.838761 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0" Oct 13 16:02:33 crc kubenswrapper[4797]: I1013 16:02:33.246266 4797 scope.go:117] "RemoveContainer" containerID="0920c8d24afe0b4d40a22e6525d60a7cf10d64ddda96852d0d01ef3518b10398" Oct 13 16:02:33 crc kubenswrapper[4797]: E1013 16:02:33.247141 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hrdxs_openshift-machine-config-operator(345b1c60-ba79-407d-8423-53010f2dfeb0)\"" pod="openshift-machine-config-operator/machine-config-daemon-hrdxs" podUID="345b1c60-ba79-407d-8423-53010f2dfeb0"